49116 1727204676.26329: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-MVC
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
49116 1727204676.27472: Added group all to inventory 49116 1727204676.27475: Added group ungrouped to inventory 49116 1727204676.27480: Group all now contains ungrouped 49116 1727204676.27484: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml 49116 1727204676.59392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 49116 1727204676.59462: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 49116 1727204676.59697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 49116 1727204676.59819: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 49116 1727204676.60003: Loaded config def from plugin (inventory/script) 49116 1727204676.60005: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 49116 1727204676.60054: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 49116 1727204676.60400: Loaded config def from plugin (inventory/yaml) 49116 1727204676.60402: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 49116 1727204676.60713: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 49116 1727204676.61339: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 49116 1727204676.61344: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 49116 1727204676.61347: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 49116 1727204676.61353: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 49116 1727204676.61358: Loading data from /tmp/network-jrl/inventory-0Xx.yml 49116 1727204676.61434: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto 49116 1727204676.61510: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 49116 1727204676.61790: Loading data from /tmp/network-jrl/inventory-0Xx.yml 49116 1727204676.61888: group all already in inventory 49116 1727204676.61896: set inventory_file for managed-node1 49116 1727204676.61901: set inventory_dir for managed-node1 49116 1727204676.61902: Added host managed-node1 to inventory 49116 1727204676.61904: Added host managed-node1 to group all 49116 1727204676.61905: set ansible_host for managed-node1 49116 1727204676.61906: set ansible_ssh_extra_args for managed-node1 49116 1727204676.61909: set inventory_file for managed-node2 49116 1727204676.61912: set inventory_dir for managed-node2 49116 1727204676.61913: Added host managed-node2 to inventory 49116 1727204676.61914: Added host managed-node2 to group
all 49116 1727204676.61915: set ansible_host for managed-node2 49116 1727204676.61916: set ansible_ssh_extra_args for managed-node2 49116 1727204676.61918: set inventory_file for managed-node3 49116 1727204676.61921: set inventory_dir for managed-node3 49116 1727204676.61922: Added host managed-node3 to inventory 49116 1727204676.61923: Added host managed-node3 to group all 49116 1727204676.61924: set ansible_host for managed-node3 49116 1727204676.61925: set ansible_ssh_extra_args for managed-node3 49116 1727204676.61928: Reconcile groups and hosts in inventory. 49116 1727204676.61932: Group ungrouped now contains managed-node1 49116 1727204676.61934: Group ungrouped now contains managed-node2 49116 1727204676.61936: Group ungrouped now contains managed-node3 49116 1727204676.62239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 49116 1727204676.62555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 49116 1727204676.62672: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 49116 1727204676.62707: Loaded config def from plugin (vars/host_group_vars) 49116 1727204676.62710: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 49116 1727204676.62718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 49116 1727204676.62728: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 49116 1727204676.63140: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 49116 1727204676.63646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204676.63756: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 49116 1727204676.63805: Loaded config def from plugin (connection/local) 49116 1727204676.63809: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 49116 1727204676.64577: Loaded config def from plugin (connection/paramiko_ssh) 49116 1727204676.64582: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 49116 1727204676.65904: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 49116 1727204676.65950: Loaded config def from plugin (connection/psrp) 49116 1727204676.65955: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 49116 1727204676.67051: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 49116 1727204676.67101: Loaded config def from plugin (connection/ssh) 49116 1727204676.67105: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 49116 1727204676.70920: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 49116 1727204676.71223: Loaded config def from plugin (connection/winrm) 49116 1727204676.71228: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 49116 1727204676.71299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 49116 1727204676.71384: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 49116 1727204676.71460: Loaded config def from plugin (shell/cmd) 49116 1727204676.71463: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 49116 1727204676.71515: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 49116 1727204676.71592: Loaded config def from plugin (shell/powershell) 49116 1727204676.71594: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 49116 1727204676.71657: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 49116 1727204676.72272: Loaded config def from plugin (shell/sh) 49116 1727204676.72275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 49116 1727204676.72316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 49116 1727204676.72462: Loaded config def from plugin (become/runas) 49116 1727204676.72647: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 49116 1727204676.72863: Loaded config def from plugin (become/su) 49116 1727204676.73074: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 49116 1727204676.73334: Loaded config def from plugin (become/sudo) 49116 1727204676.73337: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 49116 1727204676.73381: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml 49116 1727204676.73780: in VariableManager get_vars() 49116 1727204676.73804: done with get_vars() 49116 1727204676.73953: trying /usr/local/lib/python3.12/site-packages/ansible/modules 49116 1727204676.79100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 49116 1727204676.79435: in VariableManager get_vars() 49116 1727204676.79441: done with get_vars() 49116 1727204676.79444: variable 'playbook_dir' from source: magic vars 49116 1727204676.79445: variable 'ansible_playbook_python' from source: magic vars 49116 1727204676.79446: variable 'ansible_config_file' from 
source: magic vars 49116 1727204676.79447: variable 'groups' from source: magic vars 49116 1727204676.79448: variable 'omit' from source: magic vars 49116 1727204676.79448: variable 'ansible_version' from source: magic vars 49116 1727204676.79449: variable 'ansible_check_mode' from source: magic vars 49116 1727204676.79450: variable 'ansible_diff_mode' from source: magic vars 49116 1727204676.79450: variable 'ansible_forks' from source: magic vars 49116 1727204676.79451: variable 'ansible_inventory_sources' from source: magic vars 49116 1727204676.79452: variable 'ansible_skip_tags' from source: magic vars 49116 1727204676.79452: variable 'ansible_limit' from source: magic vars 49116 1727204676.79453: variable 'ansible_run_tags' from source: magic vars 49116 1727204676.79454: variable 'ansible_verbosity' from source: magic vars 49116 1727204676.79516: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml 49116 1727204676.80101: in VariableManager get_vars() 49116 1727204676.80121: done with get_vars() 49116 1727204676.80163: in VariableManager get_vars() 49116 1727204676.80181: done with get_vars() 49116 1727204676.80220: in VariableManager get_vars() 49116 1727204676.80242: done with get_vars() 49116 1727204676.80401: in VariableManager get_vars() 49116 1727204676.80417: done with get_vars() 49116 1727204676.80422: variable 'omit' from source: magic vars 49116 1727204676.80443: variable 'omit' from source: magic vars 49116 1727204676.80482: in VariableManager get_vars() 49116 1727204676.80494: done with get_vars() 49116 1727204676.80547: in VariableManager get_vars() 49116 1727204676.80561: done with get_vars() 49116 1727204676.80603: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 49116 1727204676.80868: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 49116 1727204676.81016: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 49116 1727204676.81912: in VariableManager get_vars() 49116 1727204676.81938: done with get_vars() 49116 1727204676.82519: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 49116 1727204676.82730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 49116 1727204676.85599: in VariableManager get_vars() 49116 1727204676.85623: done with get_vars() 49116 1727204676.85967: in VariableManager get_vars() 49116 1727204676.86017: done with get_vars() 49116 1727204676.86144: in VariableManager get_vars() 49116 1727204676.86162: done with get_vars() 49116 1727204676.86350: variable 'omit' from source: magic vars 49116 1727204676.86366: variable 'omit' from source: magic vars 49116 1727204676.86408: in VariableManager get_vars() 49116 1727204676.86426: done with get_vars() 49116 1727204676.86451: in VariableManager get_vars() 49116 1727204676.86472: done with get_vars() 49116 1727204676.86511: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 49116 1727204676.86779: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 49116 1727204676.86871: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 49116 
1727204676.87474: in VariableManager get_vars() 49116 1727204676.87502: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 49116 1727204676.90463: in VariableManager get_vars() 49116 1727204676.90494: done with get_vars() 49116 1727204676.90537: in VariableManager get_vars() 49116 1727204676.90570: done with get_vars() 49116 1727204676.90639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 49116 1727204676.90662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 49116 1727204676.94117: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 49116 1727204676.94315: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 49116 1727204676.94319: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 49116 1727204676.94360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 49116 1727204676.94401: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 49116 1727204676.94613: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 49116 1727204676.94685: Loaded config def from plugin (callback/default) 49116 1727204676.94689: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 49116 1727204676.96815: Loaded config def from plugin (callback/junit) 49116 1727204676.96819: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 49116 1727204676.96961: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 49116 1727204676.97158: Loaded config def from plugin (callback/minimal) 49116 1727204676.97161: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 49116 1727204676.97277: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 49116 1727204676.97443: Loaded config def from plugin (callback/tree) 49116 1727204676.97446: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 49116 1727204676.97747: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 49116 1727204676.97750: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_vlan_mtu_nm.yml ************************************************
2 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml
49116 1727204676.97850: in VariableManager get_vars() 49116 1727204676.97940: done with get_vars() 49116 1727204676.97949: in VariableManager get_vars() 49116 1727204676.97960: done with get_vars() 49116 1727204676.97964: variable 'omit' from source: magic vars 49116 1727204676.98127: in VariableManager get_vars() 49116 1727204676.98149: done with get_vars() 49116 1727204676.98177: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_vlan_mtu.yml' with nm as provider] *********
49116 1727204676.99849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 49116 1727204676.99938: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 49116 1727204677.00197: getting the remaining hosts for this loop 49116 1727204677.00199: done getting the remaining hosts for this loop 49116 1727204677.00204: getting the next task for host managed-node3 49116 1727204677.00208: done getting next task for host managed-node3 49116 1727204677.00211: ^ task is: TASK: Gathering Facts 49116 1727204677.00212: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204677.00215: getting variables 49116 1727204677.00216: in VariableManager get_vars() 49116 1727204677.00231: Calling all_inventory to load vars for managed-node3 49116 1727204677.00237: Calling groups_inventory to load vars for managed-node3 49116 1727204677.00240: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204677.00255: Calling all_plugins_play to load vars for managed-node3 49116 1727204677.00271: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204677.00275: Calling groups_plugins_play to load vars for managed-node3 49116 1727204677.00430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204677.00669: done with get_vars() 49116 1727204677.00680: done getting variables 49116 1727204677.01005: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
Tuesday 24 September 2024 15:04:37 -0400 (0:00:00.035) 0:00:00.035 *****
49116 1727204677.01038: entering _queue_task() for managed-node3/gather_facts 49116 1727204677.01040: Creating lock for gather_facts 49116 1727204677.01868: worker is 1 (out of 1 available) 49116 1727204677.01880: exiting _queue_task() for managed-node3/gather_facts 49116 1727204677.01899: done queuing things up, now waiting for results queue to drain 49116 1727204677.01901: waiting for pending results...
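For orientation, the inventory debug entries near the top of this log (three hosts added to group all, each with ansible_host and ansible_ssh_extra_args host vars, then reconciled into ungrouped) imply an inventory shaped roughly like the sketch below. This is a reconstruction for illustration only: the real /tmp/network-jrl/inventory-0Xx.yml is not included in the log, the addresses for managed-node1 and managed-node2 and the ssh option are placeholders, and only the 10.31.45.169 address for managed-node3 is taken from the SSH debug output that follows.

all:
  hosts:
    managed-node1:
      ansible_host: 192.0.2.11                          # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder option
    managed-node2:
      ansible_host: 192.0.2.12                          # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder option
    managed-node3:
      ansible_host: 10.31.45.169                        # address seen in the ssh debug output below
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder option

An inventory of this shape matches what the yaml inventory plugin reports above: group all already exists, each host gets inventory_file, inventory_dir, ansible_host and ansible_ssh_extra_args set, and because no child groups are defined the hosts land in ungrouped during reconciliation.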
49116 1727204677.02102: running TaskExecutor() for managed-node3/TASK: Gathering Facts 49116 1727204677.02231: in run() - task 127b8e07-fff9-02f7-957b-0000000000af 49116 1727204677.02296: variable 'ansible_search_path' from source: unknown 49116 1727204677.02355: calling self._execute() 49116 1727204677.02452: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204677.02464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204677.02493: variable 'omit' from source: magic vars 49116 1727204677.02671: variable 'omit' from source: magic vars 49116 1727204677.02683: variable 'omit' from source: magic vars 49116 1727204677.02758: variable 'omit' from source: magic vars 49116 1727204677.02822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204677.02877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204677.02903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204677.03043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204677.03051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204677.03054: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204677.03057: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204677.03059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204677.03185: Set connection var ansible_connection to ssh 49116 1727204677.03205: Set connection var ansible_timeout to 10 49116 1727204677.03220: Set connection var ansible_shell_executable to /bin/sh 49116 1727204677.03230: Set connection var ansible_pipelining to False 49116 1727204677.03240: Set connection var ansible_shell_type to sh 49116 1727204677.03251: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204677.03298: variable 'ansible_shell_executable' from source: unknown 49116 1727204677.03314: variable 'ansible_connection' from source: unknown 49116 1727204677.03323: variable 'ansible_module_compression' from source: unknown 49116 1727204677.03342: variable 'ansible_shell_type' from source: unknown 49116 1727204677.03377: variable 'ansible_shell_executable' from source: unknown 49116 1727204677.03474: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204677.03478: variable 'ansible_pipelining' from source: unknown 49116 1727204677.03485: variable 'ansible_timeout' from source: unknown 49116 1727204677.03489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204677.03730: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204677.03801: variable 'omit' from source: magic vars 49116 1727204677.03804: starting attempt loop 49116 1727204677.03811: running the handler 49116 1727204677.03814: variable 'ansible_facts' from source: unknown 49116 1727204677.03822: _low_level_execute_command(): starting 49116 1727204677.03839: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204677.04927: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204677.04973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204677.05004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204677.05090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204677.05239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204677.07061: stdout chunk (state=3): >>>/root <<< 49116 1727204677.07283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204677.07288: stdout chunk (state=3): >>><<< 49116 1727204677.07291: stderr chunk (state=3): >>><<< 49116 1727204677.07316: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204677.07384: _low_level_execute_command(): starting 49116 1727204677.07389: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786 `" && echo ansible-tmp-1727204677.0732348-49236-70937275087786="` echo /root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786 `" ) && sleep 0' 49116 1727204677.08221: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 
1727204677.08247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204677.08288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204677.08425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204677.10593: stdout chunk (state=3): >>>ansible-tmp-1727204677.0732348-49236-70937275087786=/root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786 <<< 49116 1727204677.10736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204677.10889: stderr chunk (state=3): >>><<< 49116 1727204677.10893: stdout chunk (state=3): >>><<< 49116 1727204677.10952: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204677.0732348-49236-70937275087786=/root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204677.10960: variable 'ansible_module_compression' from source: unknown 49116 1727204677.11014: ANSIBALLZ: Using generic lock for ansible.legacy.setup 49116 1727204677.11017: ANSIBALLZ: Acquiring lock 49116 1727204677.11020: ANSIBALLZ: Lock acquired: 139720119767104 49116 1727204677.11022: ANSIBALLZ: Creating module 49116 1727204677.90791: ANSIBALLZ: Writing module into payload 49116 1727204677.91085: ANSIBALLZ: Writing module 49116 1727204677.91127: ANSIBALLZ: Renaming module 49116 1727204677.91524: ANSIBALLZ: Done creating 
module 49116 1727204677.91528: variable 'ansible_facts' from source: unknown 49116 1727204677.91534: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204677.91538: _low_level_execute_command(): starting 49116 1727204677.91545: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 49116 1727204677.92874: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204677.92981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204677.93392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204677.93503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204677.95510: stdout chunk (state=3): >>>PLATFORM Linux <<< 49116 1727204677.95515: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 49116 1727204677.95987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204677.96025: stderr chunk (state=3): >>><<< 49116 1727204677.96029: stdout chunk (state=3): >>><<< 49116 1727204677.96057: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204677.96069 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 49116 1727204677.96116: _low_level_execute_command(): starting 49116 1727204677.96119: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 49116 1727204677.96499: Sending initial data 49116 1727204677.96503: Sent initial data (1181 bytes) 49116 1727204677.97790: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204677.97842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204677.97891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204677.98026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204677.98101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204678.02112: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 49116 1727204678.02516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204678.02774: stderr chunk (state=3): >>><<< 49116 1727204678.02778: stdout chunk (state=3): >>><<< 49116 1727204678.02871: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 
(Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204678.03014: variable 'ansible_facts' from source: unknown 49116 1727204678.03023: variable 'ansible_facts' from source: unknown 49116 1727204678.03038: variable 'ansible_module_compression' from source: unknown 49116 1727204678.03141: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 49116 1727204678.03220: variable 'ansible_facts' from source: unknown 49116 1727204678.03749: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/AnsiballZ_setup.py 49116 1727204678.03987: Sending initial data 49116 1727204678.03998: Sent initial data (153 bytes) 49116 1727204678.05414: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204678.05435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204678.05597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204678.05686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204678.06196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204678.07982: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204678.08046: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204678.08130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/AnsiballZ_setup.py" <<< 49116 1727204678.08134: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpvoshnwsf /root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/AnsiballZ_setup.py <<< 49116 1727204678.08226: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpvoshnwsf" to remote "/root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/AnsiballZ_setup.py" <<< 49116 1727204678.11577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204678.11582: stderr chunk (state=3): >>><<< 49116 1727204678.11585: stdout chunk (state=3): >>><<< 49116 1727204678.11706: done transferring module to remote 49116 1727204678.11833: _low_level_execute_command(): starting 49116 1727204678.11838: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/ /root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/AnsiballZ_setup.py && sleep 0' 49116 1727204678.13474: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204678.13479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204678.13497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204678.13744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204678.13751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204678.13823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204678.13987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204678.16091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204678.16145: stderr chunk (state=3): >>><<< 49116 1727204678.16228: stdout chunk (state=3): >>><<< 49116 1727204678.16248: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204678.16257: _low_level_execute_command(): starting 49116 1727204678.16269: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/AnsiballZ_setup.py && sleep 0' 49116 1727204678.17763: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204678.17896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204678.17900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 49116 1727204678.17903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204678.17906: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204678.17908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 
1727204678.18101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204678.18105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204678.18211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204678.20723: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 49116 1727204678.20747: stdout chunk (state=3): >>>import _imp # builtin <<< 49116 1727204678.20802: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 49116 1727204678.20851: stdout chunk (state=3): >>>import '_io' # <<< 49116 1727204678.20911: stdout chunk (state=3): >>>import 'marshal' # import 'posix' # <<< 49116 1727204678.20935: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 49116 1727204678.20970: stdout chunk (state=3): >>>import 'time' # <<< 49116 1727204678.20973: stdout chunk (state=3): >>>import 'zipimport' # <<< 49116 1727204678.20976: stdout chunk (state=3): >>># installed zipimport hook <<< 49116 1727204678.21119: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 49116 1727204678.21227: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 49116 1727204678.21231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b8118530> <<< 49116 1727204678.21249: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b80e7b30> <<< 49116 1727204678.21277: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b811aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 49116 1727204678.21304: stdout chunk (state=3): >>>import '_stat' # <<< 49116 1727204678.21314: stdout chunk (state=3): >>>import 'stat' # <<< 49116 1727204678.21412: stdout chunk (state=3): >>>import '_collections_abc' # <<< 49116 1727204678.21459: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 49116 1727204678.21694: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ecd190> # 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204678.21700: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ece090> <<< 49116 1727204678.21722: stdout chunk (state=3): >>>import 'site' # <<< 49116 1727204678.21753: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 49116 1727204678.22163: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 49116 1727204678.22249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 49116 1727204678.22286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 49116 1727204678.22299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 49116 1727204678.22335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 49116 1727204678.22358: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f0be60> <<< 49116 1727204678.22479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f0bf20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 49116 1727204678.22483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 49116 1727204678.22499: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 49116 1727204678.22549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204678.22579: stdout chunk (state=3): >>>import 'itertools' # <<< 49116 1727204678.22598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f43830> <<< 49116 1727204678.22683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f43ec0> import '_collections' # <<< 49116 1727204678.22749: stdout chunk 
(state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f23b30> import '_functools' # <<< 49116 1727204678.22761: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f21250> <<< 49116 1727204678.22867: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f09010> <<< 49116 1727204678.22912: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 49116 1727204678.23013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 49116 1727204678.23047: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f67800> <<< 49116 1727204678.23063: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f66420> <<< 49116 1727204678.23091: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f22120> <<< 49116 1727204678.23118: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f64c50> <<< 49116 1727204678.23244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f98890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f082c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 49116 1727204678.23248: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7f98d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f98bf0> <<< 49116 1727204678.23297: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.23301: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7f98fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f06de0> <<< 49116 1727204678.23453: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 49116 1727204678.23472: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f99670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f99340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f9a570> <<< 49116 1727204678.23711: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7fb47a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7fb5ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 49116 1727204678.23738: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 49116 1727204678.23741: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7fb6d80> <<< 49116 1727204678.23782: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7fb73e0> <<< 49116 1727204678.23887: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7fb62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.23899: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7fb7e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f63b7fb7560> <<< 49116 1727204678.23940: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f9a5d0> <<< 49116 1727204678.23956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 49116 1727204678.23982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 49116 1727204678.24116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7cf3da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 49116 1727204678.24132: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7d1c7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d1c530> <<< 49116 1727204678.24197: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7d1c800> <<< 49116 1727204678.24200: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7d1c9e0> <<< 49116 1727204678.24238: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7cf1f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 49116 1727204678.24500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d1dfd0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d1cc80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f9acc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 49116 1727204678.24568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204678.24573: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 49116 1727204678.24757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d4a360> <<< 49116 1727204678.24773: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 49116 1727204678.24806: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d62510> <<< 49116 1727204678.24834: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 49116 1727204678.24873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 49116 1727204678.24988: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d9b260> <<< 49116 1727204678.24991: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 49116 1727204678.25099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 49116 1727204678.25103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 49116 1727204678.25315: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7dc5a00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d9b380> <<< 49116 1727204678.25384: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d631a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7bb0380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d61550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d1ef00> <<< 49116 1727204678.25568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 49116 1727204678.25638: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f63b7d61670> <<< 49116 1727204678.26028: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_lsdwyrym/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 49116 
1727204678.26059: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.26100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 49116 1727204678.26103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 49116 1727204678.26169: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 49116 1727204678.26284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 49116 1727204678.26333: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c16000> <<< 49116 1727204678.26420: stdout chunk (state=3): >>>import '_typing' # <<< 49116 1727204678.26673: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7becef0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7bb3f80> <<< 49116 1727204678.26728: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.26762: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.26796: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 49116 1727204678.29293: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.30927: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7befe90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 49116 1727204678.31023: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7c49a90> <<< 49116 1727204678.31095: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c49820> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c49130> <<< 49116 1727204678.31099: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 
49116 1727204678.31247: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c49b80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c16c90> import 'atexit' # <<< 49116 1727204678.31313: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7c4a7e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7c4aa20> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 49116 1727204678.31521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c4af30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 49116 1727204678.31592: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7aaccb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7aae8d0> <<< 49116 1727204678.31596: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 49116 1727204678.31663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 49116 1727204678.31680: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7aaf1a0> <<< 49116 1727204678.31773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 49116 1727204678.31778: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab0350> <<< 49116 1727204678.31811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 49116 1727204678.31842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 49116 1727204678.31882: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 49116 1727204678.32142: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab2e10> # extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7ab2f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab10d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 49116 1727204678.32347: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 49116 1727204678.32428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab6db0> import '_tokenize' # <<< 49116 1727204678.32461: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab5880> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab55e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 49116 1727204678.32572: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab7d70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab15e0> <<< 49116 1727204678.32607: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7afaf30> <<< 49116 1727204678.32643: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7afb0e0> <<< 49116 1727204678.32675: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 49116 1727204678.32731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 49116 1727204678.32810: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed 
from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b04ce0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b04aa0> <<< 49116 1727204678.32844: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 49116 1727204678.33023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 49116 1727204678.33079: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b07200> <<< 49116 1727204678.33119: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b053d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 49116 1727204678.33186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204678.33235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 49116 1727204678.33263: stdout chunk (state=3): >>>import '_string' # <<< 49116 1727204678.33329: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0a9f0> <<< 49116 1727204678.33541: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b07380> <<< 49116 1727204678.33703: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0b800> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.33717: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.33776: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0ba40> <<< 49116 1727204678.33900: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0bbc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7afb3e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 49116 1727204678.34144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 49116 1727204678.34195: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0f2f0> <<< 49116 1727204678.34236: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.34310: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b10800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0da90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0ee10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0d730> <<< 49116 1727204678.34436: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.34549: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.34571: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 49116 1727204678.34594: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 49116 1727204678.34734: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.34875: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.35688: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.36169: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 49116 1727204678.36196: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 49116 1727204678.36256: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204678.36372: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7998980> <<< 49116 1727204678.36403: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 49116 1727204678.36459: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7999820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0dbb0> <<< 49116 1727204678.36486: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 49116 1727204678.36533: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.36537: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 49116 1727204678.36539: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.36770: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.36891: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 49116 1727204678.36917: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7999700> # zipimport: zlib available <<< 49116 1727204678.37475: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.37986: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.38081: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.38238: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 49116 1727204678.38252: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.38456: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 49116 1727204678.38562: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 49116 1727204678.38568: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.38826: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.39095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 49116 1727204678.39162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 49116 1727204678.39187: stdout chunk (state=3): >>>import '_ast' # <<< 49116 1727204678.39275: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b799a660> <<< 49116 1727204678.39279: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.39383: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.39508: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 49116 1727204678.39580: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # 
extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.39714: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b79a20f0> <<< 49116 1727204678.39852: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b79a2a50> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0be60> # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.39904: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 49116 1727204678.39915: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.39981: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.40009: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.40147: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 49116 1727204678.40194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204678.40399: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b79a1880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b79a2b70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 49116 1727204678.40475: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.40544: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.40617: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.40642: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 49116 1727204678.40690: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 49116 1727204678.40945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 
'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a3adb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b79afb00> <<< 49116 1727204678.41014: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b79aac60> <<< 49116 1727204678.41033: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b79aaab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 49116 1727204678.41059: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41088: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 49116 1727204678.41183: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 49116 1727204678.41216: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 49116 1727204678.41277: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41344: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41386: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.41438: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41485: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41520: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41584: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 49116 1727204678.41683: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41706: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41742: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41764: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.41821: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 49116 1727204678.42035: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.42222: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.42270: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.42331: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204678.42467: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 49116 1727204678.42496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 49116 1727204678.42511: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a41c10> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py 
# code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 49116 1727204678.42563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 49116 1727204678.42605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 49116 1727204678.42618: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6f4c380> <<< 49116 1727204678.42656: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.42669: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6f4c6b0> <<< 49116 1727204678.42729: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a21400> <<< 49116 1727204678.42751: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a20380> <<< 49116 1727204678.42796: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a402f0> <<< 49116 1727204678.42878: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a40d40> <<< 49116 1727204678.42881: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 49116 1727204678.42918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 49116 1727204678.42976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.43042: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6f4f650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6f4ef00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6f4f0e0> <<< 49116 1727204678.43054: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6f4e330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/util.py <<< 49116 1727204678.43297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6f4f7a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6fb62d0> <<< 49116 1727204678.43317: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fb42f0> <<< 49116 1727204678.43353: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a40050> import 'ansible.module_utils.facts.timeout' # <<< 49116 1727204678.43376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 49116 1727204678.43412: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 49116 1727204678.43430: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.43501: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.43554: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 49116 1727204678.43680: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.43726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 49116 1727204678.43741: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 49116 1727204678.43744: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.43763: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.43806: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 49116 1727204678.43864: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.43931: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 49116 1727204678.43967: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.43978: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.44061: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 49116 1727204678.44087: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.44164: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.44292: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 49116 1727204678.44311: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.44889: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45429: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.distribution' # <<< 49116 1727204678.45432: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45589: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45619: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 49116 1727204678.45632: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45661: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 49116 1727204678.45711: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45764: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45837: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 49116 1727204678.45840: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45874: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45921: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 49116 1727204678.45924: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45946: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.45992: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 49116 1727204678.46003: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.46111: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.46180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 49116 1727204678.46245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 49116 1727204678.46263: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fb64e0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 49116 1727204678.46280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 49116 1727204678.46436: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fb7110> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 49116 1727204678.46496: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.46685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 49116 1727204678.46688: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.46782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 49116 1727204678.46805: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.46859: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.46950: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 49116 1727204678.47014: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.47053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 49116 1727204678.47216: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 49116 1727204678.47235: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204678.47269: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6fe65a0> <<< 49116 1727204678.47489: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fd0950> import 'ansible.module_utils.facts.system.python' # <<< 49116 1727204678.47543: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.47570: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.47627: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 49116 1727204678.47674: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.47753: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.47876: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.47953: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.48219: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 49116 1727204678.48223: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.48235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 49116 1727204678.48273: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.48331: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 49116 1727204678.48413: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6e01e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e019d0> <<< 49116 1727204678.48481: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 49116 1727204678.48496: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.48552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 49116 1727204678.48752: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.48756: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.48904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 49116 1727204678.48920: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.49025: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.49136: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.49187: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 49116 1727204678.49228: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 49116 1727204678.49248: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 49116 1727204678.49271: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.49298: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.49453: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.49634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 49116 1727204678.49646: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.49763: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.49961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 49116 1727204678.49984: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.50664: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.51280: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 49116 1727204678.51311: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.51428: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.51704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 49116 1727204678.51708: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.51759: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 49116 1727204678.51776: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.51943: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.52115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 49116 1727204678.52260: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.52290: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 49116 1727204678.52377: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.52494: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.52731: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.52959: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 49116 1727204678.52982: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.53026: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.53078: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 49116 1727204678.53178: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 49116 1727204678.53238: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.53292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 49116 
1727204678.53356: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.53373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 49116 1727204678.53583: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.53623: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 49116 1727204678.53638: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.53940: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54239: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 49116 1727204678.54254: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54320: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 49116 1727204678.54395: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54433: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 49116 1727204678.54483: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54510: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 49116 1727204678.54569: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54592: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 49116 1727204678.54649: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54729: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.54859: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 49116 1727204678.54873: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 49116 1727204678.55019: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204678.55073: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.55127: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.55200: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.55277: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 49116 1727204678.55300: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 49116 1727204678.55341: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.55386: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.55433: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 49116 1727204678.55540: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.55668: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.55893: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 49116 1727204678.55949: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.56005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 49116 1727204678.56018: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.56202: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 49116 1727204678.56214: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.56304: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 49116 1727204678.56325: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.56420: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.56521: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 49116 1727204678.56610: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204678.57187: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6e2a9f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e2a3f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e24c80> <<< 49116 1727204678.75294: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 49116 1727204678.75330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 49116 1727204678.75367: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e71280> <<< 49116 1727204678.75387: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 49116 1727204678.75399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 49116 1727204678.75452: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e72030> <<< 49116 1727204678.75519: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204678.75600: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 49116 1727204678.75628: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fdc710> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fdcbc0> <<< 49116 1727204678.76078: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame<<< 49116 1727204678.76093: stdout chunk (state=3): >>> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 49116 1727204678.96452: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "38", "epoch": "1727204678", "epoch_int": "1727204678", "date": "2024-09-24", "time": "15:04:38", "iso8601_micro": "2024-09-24T19:04:38.572111Z", "iso8601": "2024-09-24T19:04:38Z", "iso8601_basic": "20240924T150438572111", "iso8601_basic_short": "20240924T150438", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.5380859375, "5m": 0.60302734375, "15m": 0.4326171875}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.or<<< 49116 1727204678.96466: stdout chunk (state=3): >>>g/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope":<<< 49116 1727204678.96497: stdout chunk (state=3): >>> "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3029, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 687, "free": 3029}, "nocache": {"free": 3482, "used": 234}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": 
"524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1016, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251301064704, "block_size": 4096, "block_total": 64479564, "block_available": 61352799, "block_used": 3126765, "inode_total": 16384000, "inode_available": 16301237, "inode_used": 82763, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 49116 1727204678.97509: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs<<< 49116 1727204678.97546: stdout chunk (state=3): >>> # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator<<< 49116 1727204678.97581: stdout chunk (state=3): >>> # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler <<< 49116 1727204678.97617: stdout chunk (state=3): >>># cleanup[2] removing copyreg # 
cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util<<< 49116 1727204678.97627: stdout chunk (state=3): >>> # cleanup[2] removing runpy # destroy runpy<<< 49116 1727204678.97639: stdout chunk (state=3): >>> # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil<<< 49116 1727204678.97659: stdout chunk (state=3): >>> # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset<<< 49116 1727204678.97689: stdout chunk (state=3): >>> # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib<<< 49116 1727204678.97711: stdout chunk (state=3): >>> # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil<<< 49116 1727204678.97735: stdout chunk (state=3): >>> # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner<<< 49116 1727204678.97759: stdout chunk (state=3): >>> # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors<<< 49116 1727204678.97790: stdout chunk (state=3): >>> # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog<<< 49116 1727204678.97814: stdout chunk (state=3): >>> # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string<<< 49116 1727204678.97846: stdout chunk (state=3): >>> # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] 
removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon<<< 49116 1727204678.97853: stdout chunk (state=3): >>> # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text<<< 49116 1727204678.97875: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes<<< 49116 1727204678.97905: stdout chunk (state=3): >>> # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections<<< 49116 1727204678.97927: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast<<< 49116 1727204678.97950: stdout chunk (state=3): >>> # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec<<< 49116 1727204678.97976: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux <<< 49116 1727204678.98019: stdout chunk (state=3): >>># cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro <<< 49116 1727204678.98027: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic<<< 
49116 1727204678.98052: stdout chunk (state=3): >>> # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq<<< 49116 1727204678.98133: stdout chunk (state=3): >>> # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin <<< 49116 1727204678.98145: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd <<< 49116 1727204678.98168: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base<<< 49116 1727204678.98186: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd<<< 49116 1727204678.98208: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace<<< 49116 1727204678.98221: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot<<< 49116 1727204678.98247: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb<<< 49116 1727204678.98271: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base <<< 49116 1727204678.98299: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl<<< 49116 1727204678.98320: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux<<< 49116 1727204678.98347: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd <<< 49116 1727204678.98370: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection <<< 49116 1727204678.98402: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.dummy <<< 49116 1727204678.99150: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery<<< 49116 1727204678.99157: stdout chunk (state=3): >>> # destroy importlib._abc<<< 49116 1727204678.99188: stdout chunk (state=3): >>> # destroy importlib.util # destroy _bz2<<< 49116 1727204678.99200: stdout chunk (state=3): >>> <<< 49116 1727204678.99210: stdout chunk (state=3): >>># destroy _compression<<< 49116 1727204678.99236: stdout chunk (state=3): >>> # destroy _lzma # destroy binascii <<< 49116 1727204678.99257: stdout chunk (state=3): >>># destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 49116 1727204678.99285: stdout chunk (state=3): >>> # 
destroy zipfile<<< 49116 1727204678.99293: stdout chunk (state=3): >>> <<< 49116 1727204678.99304: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob<<< 49116 1727204678.99362: stdout chunk (state=3): >>> # destroy ipaddress # destroy ntpath <<< 49116 1727204678.99390: stdout chunk (state=3): >>># destroy importlib <<< 49116 1727204678.99411: stdout chunk (state=3): >>># destroy zipimport <<< 49116 1727204678.99424: stdout chunk (state=3): >>># destroy __main__<<< 49116 1727204678.99450: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 49116 1727204678.99453: stdout chunk (state=3): >>> # destroy json.encoder<<< 49116 1727204678.99604: stdout chunk (state=3): >>> # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 49116 1727204678.99682: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 49116 1727204678.99686: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector<<< 49116 1727204678.99698: stdout chunk (state=3): >>> <<< 49116 1727204678.99718: stdout chunk (state=3): >>># destroy multiprocessing <<< 49116 1727204678.99729: stdout chunk (state=3): >>># destroy multiprocessing.queues <<< 49116 1727204678.99784: stdout chunk (state=3): >>># destroy multiprocessing.synchronize # destroy multiprocessing.dummy<<< 49116 1727204678.99788: stdout chunk (state=3): >>> # destroy multiprocessing.pool<<< 49116 1727204678.99794: stdout chunk (state=3): >>> # destroy signal<<< 49116 1727204678.99797: stdout chunk (state=3): >>> <<< 49116 1727204678.99817: stdout chunk (state=3): >>># destroy pickle<<< 49116 1727204678.99829: stdout chunk (state=3): >>> # destroy _compat_pickle<<< 49116 1727204678.99837: stdout chunk (state=3): >>> <<< 49116 1727204678.99877: stdout chunk (state=3): >>># destroy _pickle # destroy queue<<< 49116 1727204678.99899: stdout chunk (state=3): >>> # destroy _heapq # destroy _queue<<< 49116 1727204678.99927: stdout chunk (state=3): >>> # destroy multiprocessing.reduction<<< 49116 1727204678.99935: stdout chunk (state=3): >>> # destroy selectors<<< 49116 1727204678.99983: stdout chunk (state=3): >>> # destroy shlex <<< 49116 1727204678.99987: stdout chunk (state=3): >>># destroy fcntl<<< 49116 1727204678.99994: stdout chunk (state=3): >>> <<< 49116 1727204679.00027: stdout chunk (state=3): >>># destroy datetime # destroy subprocess <<< 49116 1727204679.00074: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 49116 1727204679.00114: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 49116 1727204679.00131: stdout chunk (state=3): >>> # destroy getpass<<< 49116 1727204679.00148: stdout chunk (state=3): >>> # destroy pwd # destroy termios<<< 49116 1727204679.00166: stdout chunk (state=3): >>> # destroy json<<< 49116 1727204679.00172: stdout chunk (state=3): >>> <<< 49116 1727204679.00228: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob<<< 49116 1727204679.00238: stdout chunk (state=3): >>> <<< 49116 1727204679.00252: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy 
ansible.module_utils.facts.collector # destroy unicodedata<<< 49116 1727204679.00274: stdout chunk (state=3): >>> # destroy errno # destroy multiprocessing.connection # destroy tempfile<<< 49116 1727204679.00297: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 49116 1727204679.00388: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna <<< 49116 1727204679.00416: stdout chunk (state=3): >>># destroy stringprep <<< 49116 1727204679.00430: stdout chunk (state=3): >>># cleanup[3] wiping configparser <<< 49116 1727204679.00452: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 49116 1727204679.00478: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian <<< 49116 1727204679.00489: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes<<< 49116 1727204679.00496: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 49116 1727204679.00526: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket<<< 49116 1727204679.00537: stdout chunk (state=3): >>> # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader<<< 49116 1727204679.00560: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback<<< 49116 1727204679.00571: stdout chunk (state=3): >>> # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize<<< 49116 1727204679.00590: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize<<< 49116 1727204679.00618: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 49116 1727204679.00626: stdout chunk (state=3): >>> <<< 49116 1727204679.00639: stdout chunk (state=3): >>># cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 49116 1727204679.00656: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 49116 1727204679.00669: stdout chunk (state=3): >>> # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 49116 1727204679.00689: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 49116 1727204679.00709: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 49116 1727204679.00737: stdout chunk (state=3): >>> # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 49116 1727204679.00756: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 49116 1727204679.00778: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 49116 1727204679.00783: stdout chunk (state=3): >>> # cleanup[3] wiping collections<<< 49116 1727204679.00805: stdout chunk (state=3): >>> # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator<<< 49116 1727204679.00841: stdout chunk (state=3): >>> # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] 
wiping os # destroy posixpath<<< 49116 1727204679.00850: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath<<< 49116 1727204679.00875: stdout chunk (state=3): >>> # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc<<< 49116 1727204679.00888: stdout chunk (state=3): >>> # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 49116 1727204679.00914: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 49116 1727204679.00920: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 49116 1727204679.00944: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 49116 1727204679.00970: stdout chunk (state=3): >>> # cleanup[3] wiping builtins<<< 49116 1727204679.00977: stdout chunk (state=3): >>> <<< 49116 1727204679.00998: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime<<< 49116 1727204679.01101: stdout chunk (state=3): >>> <<< 49116 1727204679.01347: stdout chunk (state=3): >>># destroy sys.monitoring <<< 49116 1727204679.01366: stdout chunk (state=3): >>># destroy _socket<<< 49116 1727204679.01372: stdout chunk (state=3): >>> <<< 49116 1727204679.01416: stdout chunk (state=3): >>># destroy _collections<<< 49116 1727204679.01425: stdout chunk (state=3): >>> <<< 49116 1727204679.01457: stdout chunk (state=3): >>># destroy platform<<< 49116 1727204679.01474: stdout chunk (state=3): >>> <<< 49116 1727204679.01477: stdout chunk (state=3): >>># destroy _uuid <<< 49116 1727204679.01496: stdout chunk (state=3): >>># destroy stat <<< 49116 1727204679.01523: stdout chunk (state=3): >>># destroy genericpath # destroy re._parser # destroy tokenize<<< 49116 1727204679.01531: stdout chunk (state=3): >>> <<< 49116 1727204679.01572: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg<<< 49116 1727204679.01601: stdout chunk (state=3): >>> # destroy contextlib <<< 49116 1727204679.01628: stdout chunk (state=3): >>># destroy _typing<<< 49116 1727204679.01649: stdout chunk (state=3): >>> # destroy _tokenize<<< 49116 1727204679.01654: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error<<< 49116 1727204679.01708: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external <<< 49116 1727204679.01715: stdout chunk (state=3): >>># destroy _imp<<< 49116 1727204679.01770: stdout chunk (state=3): >>> # destroy _io # destroy marshal # clear sys.meta_path<<< 49116 1727204679.01786: stdout chunk (state=3): >>> <<< 49116 1727204679.01798: stdout chunk (state=3): >>># clear sys.modules # destroy _frozen_importlib<<< 49116 1727204679.01805: stdout chunk (state=3): >>> <<< 49116 1727204679.01941: stdout chunk (state=3): >>># destroy codecs<<< 49116 1727204679.01968: stdout chunk (state=3): >>> # destroy 
encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io<<< 49116 1727204679.01981: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit<<< 49116 1727204679.01996: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect<<< 49116 1727204679.02018: stdout chunk (state=3): >>> # destroy time<<< 49116 1727204679.02063: stdout chunk (state=3): >>> # destroy _random <<< 49116 1727204679.02081: stdout chunk (state=3): >>># destroy _weakref <<< 49116 1727204679.02133: stdout chunk (state=3): >>># destroy _operator<<< 49116 1727204679.02143: stdout chunk (state=3): >>> # destroy _sha2 # destroy _sre # destroy _string<<< 49116 1727204679.02177: stdout chunk (state=3): >>> # destroy re # destroy itertools<<< 49116 1727204679.02181: stdout chunk (state=3): >>> <<< 49116 1727204679.02205: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools<<< 49116 1727204679.02245: stdout chunk (state=3): >>> # destroy builtins # destroy _thread # clear sys.audit hooks<<< 49116 1727204679.02251: stdout chunk (state=3): >>> <<< 49116 1727204679.02891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204679.02960: stderr chunk (state=3): >>><<< 49116 1727204679.02964: stdout chunk (state=3): >>><<< 49116 1727204679.03080: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b8118530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b80e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b811aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ecd190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ece090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f0be60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f0bf20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f43830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f43ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f23b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f21250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f09010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f67800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f66420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f22120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f64c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f98890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f082c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7f98d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f98bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7f98fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f06de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f99670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f99340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f9a570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7fb47a0> import 'errno' # 
# extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7fb5ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7fb6d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7fb73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7fb62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7fb7e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7fb7560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f9a5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7cf3da0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7d1c7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d1c530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7d1c800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # 
extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7d1c9e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7cf1f40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d1dfd0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d1cc80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7f9acc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d4a360> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d62510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d9b260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7dc5a00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d9b380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d631a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7bb0380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7d61550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f63b7d1ef00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f63b7d61670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_lsdwyrym/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c16000> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7becef0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7bb3f80> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7befe90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7c49a90> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c49820> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c49130> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c49b80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c16c90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7c4a7e0> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7c4aa20> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7c4af30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7aaccb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7aae8d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7aaf1a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab0350> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab2e10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7ab2f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab10d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab6db0> import '_tokenize' # import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab5880> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab55e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab7d70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7ab15e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7afaf30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7afb0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b04ce0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b04aa0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b07200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b053d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0a9f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b07380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0b800> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0ba40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0bbc0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7afb3e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0f2f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b10800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0da90> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7b0ee10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0d730> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b7998980> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7999820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0dbb0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7999700> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b799a660> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b79a20f0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b79a2a50> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7b0be60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b79a1880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b79a2b70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a3adb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b79afb00> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b79aac60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b79aaab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 
'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a41c10> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6f4c380> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6f4c6b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a21400> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a20380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a402f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a40d40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6f4f650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6f4ef00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6f4f0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6f4e330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6f4f7a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6fb62d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fb42f0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b7a40050> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fb64e0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fb7110> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module 
'_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6fe65a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fd0950> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6e01e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e019d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f63b6e2a9f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e2a3f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e24c80> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e71280> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6e72030> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fdc710> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f63b6fdcbc0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", 
"ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "38", "epoch": "1727204678", "epoch_int": "1727204678", "date": "2024-09-24", "time": "15:04:38", "iso8601_micro": "2024-09-24T19:04:38.572111Z", "iso8601": "2024-09-24T19:04:38Z", "iso8601_basic": "20240924T150438572111", "iso8601_basic_short": "20240924T150438", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.5380859375, "5m": 0.60302734375, "15m": 0.4326171875}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", 
"scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off 
[fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3029, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 687, "free": 3029}, "nocache": {"free": 3482, "used": 234}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": 
{"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1016, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251301064704, "block_size": 4096, "block_total": 64479564, "block_available": 61352799, "block_used": 3126765, "inode_total": 16384000, "inode_available": 16301237, "inode_used": 82763, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools 
# cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy 
ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing 
multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] 
removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy 
ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # 
cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # 
destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix 
posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 49116 1727204679.03977: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204679.03980: _low_level_execute_command(): starting 49116 1727204679.03982: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204677.0732348-49236-70937275087786/ > /dev/null 2>&1 && sleep 0' 49116 1727204679.04228: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.04232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.04235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.04271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.04307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204679.04310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204679.04313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204679.04397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204679.07255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204679.07259: stderr chunk (state=3): >>><<< 49116 1727204679.07273: stdout chunk (state=3): >>><<< 49116 1727204679.07411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204679.07415: handler run complete 49116 1727204679.07458: variable 'ansible_facts' from source: unknown 49116 1727204679.07584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204679.08795: variable 'ansible_facts' from source: unknown 49116 1727204679.08868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204679.08960: attempt loop complete, returning result 49116 1727204679.08963: _execute() done 49116 1727204679.08966: dumping result to json 49116 1727204679.08990: done dumping result, returning 49116 1727204679.08999: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-02f7-957b-0000000000af] 49116 1727204679.09002: sending task result for task 127b8e07-fff9-02f7-957b-0000000000af 49116 1727204679.09290: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000af 49116 1727204679.09293: WORKER PROCESS EXITING ok: [managed-node3] 49116 1727204679.09590: no more pending results, returning what we have 49116 1727204679.09592: results queue empty 49116 1727204679.09593: checking for any_errors_fatal 49116 1727204679.09594: done checking for any_errors_fatal 49116 1727204679.09594: checking for max_fail_percentage 49116 1727204679.09595: done checking for max_fail_percentage 49116 1727204679.09596: checking to see if all hosts have failed and the running result is not ok 49116 1727204679.09596: done checking to see if all hosts have failed 49116 1727204679.09597: getting the remaining hosts for this loop 49116 1727204679.09598: done getting the remaining hosts for this loop 49116 1727204679.09601: getting the next task for host managed-node3 49116 1727204679.09605: done getting next task for host managed-node3 49116 1727204679.09607: ^ task is: TASK: meta (flush_handlers) 49116 1727204679.09608: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204679.09611: getting variables 49116 1727204679.09612: in VariableManager get_vars() 49116 1727204679.09633: Calling all_inventory to load vars for managed-node3 49116 1727204679.09635: Calling groups_inventory to load vars for managed-node3 49116 1727204679.09637: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204679.09646: Calling all_plugins_play to load vars for managed-node3 49116 1727204679.09648: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204679.09650: Calling groups_plugins_play to load vars for managed-node3 49116 1727204679.09784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204679.10036: done with get_vars() 49116 1727204679.10052: done getting variables 49116 1727204679.10112: in VariableManager get_vars() 49116 1727204679.10123: Calling all_inventory to load vars for managed-node3 49116 1727204679.10125: Calling groups_inventory to load vars for managed-node3 49116 1727204679.10127: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204679.10139: Calling all_plugins_play to load vars for managed-node3 49116 1727204679.10141: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204679.10143: Calling groups_plugins_play to load vars for managed-node3 49116 1727204679.10263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204679.10427: done with get_vars() 49116 1727204679.10443: done queuing things up, now waiting for results queue to drain 49116 1727204679.10446: results queue empty 49116 1727204679.10447: checking for any_errors_fatal 49116 1727204679.10449: done checking for any_errors_fatal 49116 1727204679.10450: checking for max_fail_percentage 49116 1727204679.10459: done checking for max_fail_percentage 49116 1727204679.10459: checking to see if all hosts have failed and the running result is not ok 49116 1727204679.10460: done checking to see if all hosts have failed 49116 1727204679.10460: getting the remaining hosts for this loop 49116 1727204679.10461: done getting the remaining hosts for this loop 49116 1727204679.10463: getting the next task for host managed-node3 49116 1727204679.10469: done getting next task for host managed-node3 49116 1727204679.10471: ^ task is: TASK: Include the task 'el_repo_setup.yml' 49116 1727204679.10472: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204679.10473: getting variables 49116 1727204679.10474: in VariableManager get_vars() 49116 1727204679.10481: Calling all_inventory to load vars for managed-node3 49116 1727204679.10482: Calling groups_inventory to load vars for managed-node3 49116 1727204679.10484: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204679.10488: Calling all_plugins_play to load vars for managed-node3 49116 1727204679.10489: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204679.10491: Calling groups_plugins_play to load vars for managed-node3 49116 1727204679.10615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204679.10843: done with get_vars() 49116 1727204679.10852: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:11 Tuesday 24 September 2024 15:04:39 -0400 (0:00:02.099) 0:00:02.134 ***** 49116 1727204679.10942: entering _queue_task() for managed-node3/include_tasks 49116 1727204679.10944: Creating lock for include_tasks 49116 1727204679.11309: worker is 1 (out of 1 available) 49116 1727204679.11322: exiting _queue_task() for managed-node3/include_tasks 49116 1727204679.11337: done queuing things up, now waiting for results queue to drain 49116 1727204679.11339: waiting for pending results... 49116 1727204679.11648: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 49116 1727204679.11716: in run() - task 127b8e07-fff9-02f7-957b-000000000006 49116 1727204679.11729: variable 'ansible_search_path' from source: unknown 49116 1727204679.11780: calling self._execute() 49116 1727204679.11832: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204679.11841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204679.11850: variable 'omit' from source: magic vars 49116 1727204679.11945: _execute() done 49116 1727204679.11950: dumping result to json 49116 1727204679.11953: done dumping result, returning 49116 1727204679.11959: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-02f7-957b-000000000006] 49116 1727204679.11968: sending task result for task 127b8e07-fff9-02f7-957b-000000000006 49116 1727204679.12072: done sending task result for task 127b8e07-fff9-02f7-957b-000000000006 49116 1727204679.12075: WORKER PROCESS EXITING 49116 1727204679.12121: no more pending results, returning what we have 49116 1727204679.12126: in VariableManager get_vars() 49116 1727204679.12163: Calling all_inventory to load vars for managed-node3 49116 1727204679.12168: Calling groups_inventory to load vars for managed-node3 49116 1727204679.12172: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204679.12184: Calling all_plugins_play to load vars for managed-node3 49116 1727204679.12186: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204679.12189: Calling groups_plugins_play to load vars for managed-node3 49116 1727204679.12347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204679.12487: done with get_vars() 49116 1727204679.12496: variable 'ansible_search_path' from source: unknown 49116 1727204679.12509: we have included files to process 49116 1727204679.12510: 
generating all_blocks data 49116 1727204679.12511: done generating all_blocks data 49116 1727204679.12511: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 49116 1727204679.12513: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 49116 1727204679.12514: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 49116 1727204679.13030: in VariableManager get_vars() 49116 1727204679.13047: done with get_vars() 49116 1727204679.13056: done processing included file 49116 1727204679.13057: iterating over new_blocks loaded from include file 49116 1727204679.13059: in VariableManager get_vars() 49116 1727204679.13066: done with get_vars() 49116 1727204679.13068: filtering new block on tags 49116 1727204679.13079: done filtering new block on tags 49116 1727204679.13081: in VariableManager get_vars() 49116 1727204679.13087: done with get_vars() 49116 1727204679.13088: filtering new block on tags 49116 1727204679.13098: done filtering new block on tags 49116 1727204679.13100: in VariableManager get_vars() 49116 1727204679.13106: done with get_vars() 49116 1727204679.13107: filtering new block on tags 49116 1727204679.13115: done filtering new block on tags 49116 1727204679.13116: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 49116 1727204679.13121: extending task lists for all hosts with included blocks 49116 1727204679.13160: done extending task lists 49116 1727204679.13161: done processing included files 49116 1727204679.13161: results queue empty 49116 1727204679.13162: checking for any_errors_fatal 49116 1727204679.13163: done checking for any_errors_fatal 49116 1727204679.13163: checking for max_fail_percentage 49116 1727204679.13164: done checking for max_fail_percentage 49116 1727204679.13166: checking to see if all hosts have failed and the running result is not ok 49116 1727204679.13167: done checking to see if all hosts have failed 49116 1727204679.13167: getting the remaining hosts for this loop 49116 1727204679.13168: done getting the remaining hosts for this loop 49116 1727204679.13170: getting the next task for host managed-node3 49116 1727204679.13173: done getting next task for host managed-node3 49116 1727204679.13174: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 49116 1727204679.13175: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
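
Note: the include step traced above ("Include the task 'el_repo_setup.yml'", queued as an include_tasks action from tests_vlan_mtu_nm.yml:11) loads tasks/el_repo_setup.yml and splices its blocks into the task list for managed-node3. Reconstructed from the trace, the playbook entry is presumably along these lines; the relative path is an assumption, and only the task name, the action type, and the resolved file appear in the log:

# Sketch of the entry at tests_vlan_mtu_nm.yml:11 as suggested by the trace.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
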
False 49116 1727204679.13177: getting variables 49116 1727204679.13177: in VariableManager get_vars() 49116 1727204679.13184: Calling all_inventory to load vars for managed-node3 49116 1727204679.13186: Calling groups_inventory to load vars for managed-node3 49116 1727204679.13187: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204679.13192: Calling all_plugins_play to load vars for managed-node3 49116 1727204679.13194: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204679.13195: Calling groups_plugins_play to load vars for managed-node3 49116 1727204679.13314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204679.13452: done with get_vars() 49116 1727204679.13459: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:04:39 -0400 (0:00:00.025) 0:00:02.160 ***** 49116 1727204679.13515: entering _queue_task() for managed-node3/setup 49116 1727204679.13815: worker is 1 (out of 1 available) 49116 1727204679.13828: exiting _queue_task() for managed-node3/setup 49116 1727204679.13843: done queuing things up, now waiting for results queue to drain 49116 1727204679.13845: waiting for pending results... 49116 1727204679.14190: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 49116 1727204679.14223: in run() - task 127b8e07-fff9-02f7-957b-0000000000c0 49116 1727204679.14275: variable 'ansible_search_path' from source: unknown 49116 1727204679.14283: variable 'ansible_search_path' from source: unknown 49116 1727204679.14287: calling self._execute() 49116 1727204679.14372: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204679.14378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204679.14391: variable 'omit' from source: magic vars 49116 1727204679.14978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204679.17460: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204679.17568: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204679.17589: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204679.17648: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204679.17685: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204679.17793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204679.17892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204679.17896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 49116 1727204679.17915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204679.17941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204679.18148: variable 'ansible_facts' from source: unknown 49116 1727204679.18241: variable 'network_test_required_facts' from source: task vars 49116 1727204679.18288: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 49116 1727204679.18299: variable 'omit' from source: magic vars 49116 1727204679.18353: variable 'omit' from source: magic vars 49116 1727204679.18396: variable 'omit' from source: magic vars 49116 1727204679.18435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204679.18541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204679.18545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204679.18547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204679.18550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204679.18577: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204679.18587: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204679.18596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204679.18712: Set connection var ansible_connection to ssh 49116 1727204679.18735: Set connection var ansible_timeout to 10 49116 1727204679.18750: Set connection var ansible_shell_executable to /bin/sh 49116 1727204679.18764: Set connection var ansible_pipelining to False 49116 1727204679.18774: Set connection var ansible_shell_type to sh 49116 1727204679.18784: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204679.18816: variable 'ansible_shell_executable' from source: unknown 49116 1727204679.18866: variable 'ansible_connection' from source: unknown 49116 1727204679.18870: variable 'ansible_module_compression' from source: unknown 49116 1727204679.18873: variable 'ansible_shell_type' from source: unknown 49116 1727204679.18875: variable 'ansible_shell_executable' from source: unknown 49116 1727204679.18877: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204679.18879: variable 'ansible_pipelining' from source: unknown 49116 1727204679.18882: variable 'ansible_timeout' from source: unknown 49116 1727204679.18884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204679.19043: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204679.19061: variable 'omit' from source: magic vars 49116 1727204679.19074: starting attempt loop 49116 
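
Note: at this point the "Gather the minimum subset of ansible_facts required by the network role test" task (el_repo_setup.yml:3) has evaluated its conditional to True, so the setup module will run, and the connection variables for managed-node3 are resolved (ssh connection, 10 second timeout, /bin/sh shell, pipelining off, ZIP_DEFLATED module compression). A task matching this trace would look roughly like the sketch below; only the task name and the when expression are taken from the log, and the gather_subset value is an assumption based on the task name:

# Hedged reconstruction of el_repo_setup.yml:3 from the trace, not the actual file contents.
- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset: min   # assumed; "minimum subset" comes only from the task name
  when: >-
    not ansible_facts.keys() | list |
    intersect(network_test_required_facts) == network_test_required_facts
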
1727204679.19191: running the handler 49116 1727204679.19195: _low_level_execute_command(): starting 49116 1727204679.19198: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204679.19984: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.20085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.20097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204679.20119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204679.20144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204679.20245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204679.22121: stdout chunk (state=3): >>>/root <<< 49116 1727204679.22320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204679.22349: stdout chunk (state=3): >>><<< 49116 1727204679.22362: stderr chunk (state=3): >>><<< 49116 1727204679.22497: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204679.22509: _low_level_execute_command(): starting 49116 1727204679.22512: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933 `" && echo ansible-tmp-1727204679.2239277-49314-214511918793933="` echo 
/root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933 `" ) && sleep 0' 49116 1727204679.23193: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204679.23315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204679.23370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204679.23442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204679.25649: stdout chunk (state=3): >>>ansible-tmp-1727204679.2239277-49314-214511918793933=/root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933 <<< 49116 1727204679.25881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204679.25885: stdout chunk (state=3): >>><<< 49116 1727204679.25888: stderr chunk (state=3): >>><<< 49116 1727204679.25908: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204679.2239277-49314-214511918793933=/root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204679.25994: variable 'ansible_module_compression' from source: unknown 49116 1727204679.26052: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 49116 1727204679.26140: variable 'ansible_facts' from source: unknown 49116 1727204679.26375: transferring module 
to remote /root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/AnsiballZ_setup.py 49116 1727204679.26571: Sending initial data 49116 1727204679.26574: Sent initial data (154 bytes) 49116 1727204679.27460: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.27480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204679.27541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204679.27618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204679.29427: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204679.29497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204679.29583: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpgygk4j65 /root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/AnsiballZ_setup.py <<< 49116 1727204679.29587: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/AnsiballZ_setup.py" <<< 49116 1727204679.29647: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpgygk4j65" to remote "/root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/AnsiballZ_setup.py" <<< 49116 1727204679.31237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204679.31374: stderr chunk (state=3): >>><<< 49116 1727204679.31378: stdout chunk (state=3): >>><<< 49116 1727204679.31381: done transferring module to remote 49116 1727204679.31383: _low_level_execute_command(): starting 49116 1727204679.31386: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/ /root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/AnsiballZ_setup.py && sleep 0' 49116 1727204679.31888: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.31893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.31900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.31902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.31968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204679.31974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204679.31976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204679.32046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204679.34389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204679.34472: stderr chunk (state=3): >>><<< 49116 1727204679.34478: stdout chunk (state=3): >>><<< 49116 1727204679.34487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 
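
Note: because ansible_pipelining was resolved to False for this task, the module runs through the temporary-file path traced above: create a per-task directory under /root/.ansible/tmp on the target, upload AnsiballZ_setup.py over SFTP through the existing ControlMaster session, then chmod it before executing it. The connection settings the trace reports could be expressed as host variables roughly like this (an illustrative sketch, not the inventory actually used; with pipelining enabled the mkdir/put/chmod round-trips would be skipped):

# Connection settings as reported in the trace for managed-node3 (illustrative placement).
ansible_connection: ssh
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false   # true would pipe the module over ssh and skip the temp-file upload
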
originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204679.34490: _low_level_execute_command(): starting 49116 1727204679.34496: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/AnsiballZ_setup.py && sleep 0' 49116 1727204679.35004: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.35008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.35011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.35013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.35078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204679.35081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204679.35162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204679.38433: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 49116 1727204679.38471: stdout chunk (state=3): >>>import 'posix' # <<< 49116 1727204679.38529: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 49116 1727204679.38563: stdout chunk (state=3): >>>import 'time' # <<< 49116 1727204679.38568: stdout chunk (state=3): >>>import 'zipimport' # <<< 49116 1727204679.38570: stdout chunk (state=3): >>># installed zipimport hook <<< 49116 1727204679.38649: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 49116 1727204679.38652: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204679.38693: stdout chunk (state=3): >>>import '_codecs' # <<< 49116 1727204679.38722: stdout chunk (state=3): >>>import 'codecs' # <<< 49116 1727204679.38813: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 49116 1727204679.38822: stdout chunk (state=3): >>> <<< 49116 1727204679.38840: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034bfc530> <<< 49116 1727204679.38894: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034bcbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 49116 1727204679.38900: stdout chunk (state=3): >>> <<< 49116 1727204679.38935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034bfeab0><<< 49116 1727204679.38938: stdout chunk (state=3): >>> <<< 49116 1727204679.38967: stdout chunk (state=3): >>>import '_signal' # <<< 49116 1727204679.39010: stdout chunk (state=3): >>> import '_abc' # <<< 49116 1727204679.39016: stdout chunk (state=3): >>> <<< 49116 1727204679.39036: stdout chunk (state=3): >>>import 'abc' # <<< 49116 1727204679.39073: stdout chunk (state=3): >>>import 'io' # <<< 49116 1727204679.39132: stdout chunk (state=3): >>> import '_stat' # <<< 49116 1727204679.39144: stdout chunk (state=3): >>> <<< 49116 1727204679.39154: stdout chunk (state=3): >>>import 'stat' # <<< 49116 1727204679.39310: stdout chunk (state=3): >>>import '_collections_abc' # <<< 49116 1727204679.39316: stdout chunk (state=3): >>> <<< 49116 1727204679.39361: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 49116 1727204679.39405: stdout chunk (state=3): >>> <<< 49116 1727204679.39424: stdout chunk (state=3): >>>import 'os' # <<< 49116 1727204679.39427: stdout chunk (state=3): >>> <<< 49116 1727204679.39448: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 49116 1727204679.39484: stdout chunk (state=3): >>> Processing user site-packages <<< 49116 1727204679.39500: stdout chunk (state=3): >>>Processing global site-packages<<< 49116 1727204679.39517: stdout chunk (state=3): >>> Adding directory: '/usr/local/lib/python3.12/site-packages'<<< 49116 1727204679.39710: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0349d11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 49116 1727204679.39731: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 49116 
1727204679.39756: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0349d20c0><<< 49116 1727204679.39761: stdout chunk (state=3): >>> <<< 49116 1727204679.39816: stdout chunk (state=3): >>>import 'site' # <<< 49116 1727204679.39871: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 49116 1727204679.39882: stdout chunk (state=3): >>> Type "help", "copyright", "credits" or "license" for more information.<<< 49116 1727204679.39888: stdout chunk (state=3): >>> <<< 49116 1727204679.40605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 49116 1727204679.40609: stdout chunk (state=3): >>> <<< 49116 1727204679.40641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 49116 1727204679.40643: stdout chunk (state=3): >>> <<< 49116 1727204679.40681: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 49116 1727204679.40688: stdout chunk (state=3): >>> <<< 49116 1727204679.40711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 49116 1727204679.40716: stdout chunk (state=3): >>> <<< 49116 1727204679.40753: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 49116 1727204679.40757: stdout chunk (state=3): >>> <<< 49116 1727204679.40838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 49116 1727204679.40842: stdout chunk (state=3): >>> <<< 49116 1727204679.40875: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 49116 1727204679.40878: stdout chunk (state=3): >>> <<< 49116 1727204679.40918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 49116 1727204679.40949: stdout chunk (state=3): >>> import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a0ffb0> <<< 49116 1727204679.40983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 49116 1727204679.41021: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 49116 1727204679.41024: stdout chunk (state=3): >>> <<< 49116 1727204679.41055: stdout chunk (state=3): >>>import '_operator' # <<< 49116 1727204679.41080: stdout chunk (state=3): >>> import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a24140> <<< 49116 1727204679.41158: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 49116 1727204679.41206: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 49116 1727204679.41295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204679.41364: stdout chunk (state=3): >>>import 'itertools' # # 
/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 49116 1727204679.41381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 49116 1727204679.41428: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a47950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 49116 1727204679.41454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 49116 1727204679.41497: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a47fe0> import '_collections' # <<< 49116 1727204679.41502: stdout chunk (state=3): >>> <<< 49116 1727204679.41584: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a27c20><<< 49116 1727204679.41608: stdout chunk (state=3): >>> import '_functools' # <<< 49116 1727204679.41613: stdout chunk (state=3): >>> <<< 49116 1727204679.41663: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a253a0> <<< 49116 1727204679.41873: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a0d160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 49116 1727204679.41912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 49116 1727204679.41943: stdout chunk (state=3): >>>import '_sre' # <<< 49116 1727204679.41983: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 49116 1727204679.41986: stdout chunk (state=3): >>> <<< 49116 1727204679.42027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 49116 1727204679.42058: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 49116 1727204679.42082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 49116 1727204679.42140: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a6b8f0><<< 49116 1727204679.42144: stdout chunk (state=3): >>> <<< 49116 1727204679.42172: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a6a510><<< 49116 1727204679.42177: stdout chunk (state=3): >>> <<< 49116 1727204679.42217: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 49116 1727204679.42228: stdout chunk (state=3): >>> <<< 49116 1727204679.42238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 49116 1727204679.42248: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a26240> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a68d70><<< 49116 1727204679.42302: stdout chunk (state=3): >>> <<< 49116 1727204679.42340: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 49116 1727204679.42348: stdout chunk (state=3): >>> <<< 49116 1727204679.42370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 49116 1727204679.42373: stdout chunk (state=3): >>> import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a98980><<< 49116 1727204679.42399: stdout chunk (state=3): >>> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a0c3e0><<< 49116 1727204679.42403: stdout chunk (state=3): >>> <<< 49116 1727204679.42476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.42485: stdout chunk (state=3): >>> <<< 49116 1727204679.42513: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.42516: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034a98e30><<< 49116 1727204679.42518: stdout chunk (state=3): >>> <<< 49116 1727204679.42576: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a98ce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.42597: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.42608: stdout chunk (state=3): >>> <<< 49116 1727204679.42619: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034a990d0> <<< 49116 1727204679.42676: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a0af00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 49116 1727204679.42686: stdout chunk (state=3): >>> <<< 49116 1727204679.42695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204679.42739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 49116 1727204679.42743: stdout chunk (state=3): >>> <<< 49116 1727204679.42786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 49116 1727204679.42826: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a997c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a99490> import 'importlib.machinery' # <<< 49116 1727204679.42876: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py<<< 49116 1727204679.42879: stdout chunk (state=3): >>> <<< 49116 1727204679.42885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 49116 1727204679.42898: stdout chunk (state=3): 
>>> <<< 49116 1727204679.42919: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a9a6c0><<< 49116 1727204679.42959: stdout chunk (state=3): >>> import 'importlib.util' # <<< 49116 1727204679.42975: stdout chunk (state=3): >>>import 'runpy' # <<< 49116 1727204679.43018: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 49116 1727204679.43022: stdout chunk (state=3): >>> <<< 49116 1727204679.43085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 49116 1727204679.43125: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc'<<< 49116 1727204679.43163: stdout chunk (state=3): >>> import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034ab48c0> import 'errno' # <<< 49116 1727204679.43168: stdout chunk (state=3): >>> <<< 49116 1727204679.43200: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.43224: stdout chunk (state=3): >>> # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.43253: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034ab5fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 49116 1727204679.43257: stdout chunk (state=3): >>> <<< 49116 1727204679.43281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 49116 1727204679.43317: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 49116 1727204679.43337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 49116 1727204679.43357: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034ab6e40><<< 49116 1727204679.43417: stdout chunk (state=3): >>> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.43427: stdout chunk (state=3): >>> <<< 49116 1727204679.43443: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034ab7470><<< 49116 1727204679.43452: stdout chunk (state=3): >>> <<< 49116 1727204679.43471: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034ab6390><<< 49116 1727204679.43513: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 49116 1727204679.43539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 49116 1727204679.43614: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.43639: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034ab7e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034ab7590> <<< 49116 1727204679.43741: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a9a6f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 49116 1727204679.43790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 49116 1727204679.43827: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 49116 1727204679.43831: stdout chunk (state=3): >>> <<< 49116 1727204679.43869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 49116 1727204679.43875: stdout chunk (state=3): >>> <<< 49116 1727204679.43926: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.43968: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0347fbce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 49116 1727204679.43974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 49116 1727204679.44017: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.44033: stdout chunk (state=3): >>> <<< 49116 1727204679.44047: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034828860><<< 49116 1727204679.44056: stdout chunk (state=3): >>> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348285c0><<< 49116 1727204679.44092: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.44110: stdout chunk (state=3): >>> <<< 49116 1727204679.44117: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.44124: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034828800><<< 49116 1727204679.44173: stdout chunk (state=3): >>> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.44180: stdout chunk (state=3): >>> <<< 49116 1727204679.44187: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.44220: stdout chunk (state=3): >>>import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0348289e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd0347f9e80><<< 49116 1727204679.44225: stdout chunk (state=3): >>> <<< 49116 1727204679.44256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 49116 1727204679.44404: stdout chunk (state=3): >>> <<< 49116 1727204679.44462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 49116 1727204679.44468: stdout chunk (state=3): >>> <<< 49116 1727204679.44501: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py<<< 49116 1727204679.44505: stdout chunk (state=3): >>> <<< 49116 1727204679.44530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc'<<< 49116 1727204679.44537: stdout chunk (state=3): >>> <<< 49116 1727204679.44554: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03482a090><<< 49116 1727204679.44559: stdout chunk (state=3): >>> <<< 49116 1727204679.44609: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034828d10><<< 49116 1727204679.44612: stdout chunk (state=3): >>> <<< 49116 1727204679.44646: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a9ade0><<< 49116 1727204679.44651: stdout chunk (state=3): >>> <<< 49116 1727204679.44692: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 49116 1727204679.44696: stdout chunk (state=3): >>> <<< 49116 1727204679.44789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc'<<< 49116 1727204679.44793: stdout chunk (state=3): >>> <<< 49116 1727204679.44835: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 49116 1727204679.44837: stdout chunk (state=3): >>> <<< 49116 1727204679.44909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 49116 1727204679.44912: stdout chunk (state=3): >>> <<< 49116 1727204679.44958: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348523c0><<< 49116 1727204679.45041: stdout chunk (state=3): >>> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 49116 1727204679.45074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204679.45117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 49116 1727204679.45164: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 49116 1727204679.45169: stdout chunk (state=3): >>> <<< 49116 1727204679.45249: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03486a510><<< 49116 1727204679.45284: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py<<< 49116 1727204679.45290: stdout chunk (state=3): >>> <<< 49116 1727204679.45362: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 49116 1727204679.45500: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 49116 1727204679.45512: stdout chunk (state=3): >>> <<< 49116 1727204679.45518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204679.45557: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348a72c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 49116 1727204679.45566: stdout chunk (state=3): >>> <<< 49116 1727204679.45635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 49116 1727204679.45739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 49116 1727204679.45743: stdout chunk (state=3): >>> <<< 49116 1727204679.45912: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348c9a60><<< 49116 1727204679.45919: stdout chunk (state=3): >>> <<< 49116 1727204679.46037: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348a73e0> <<< 49116 1727204679.46117: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03486b170><<< 49116 1727204679.46121: stdout chunk (state=3): >>> <<< 49116 1727204679.46164: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py<<< 49116 1727204679.46168: stdout chunk (state=3): >>> <<< 49116 1727204679.46208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0346a4440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034869550><<< 49116 1727204679.46214: stdout chunk (state=3): >>> <<< 49116 1727204679.46226: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03482af90><<< 49116 1727204679.46417: stdout chunk (state=3): >>> <<< 49116 1727204679.46559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 49116 1727204679.46580: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd0346a46e0> <<< 49116 1727204679.46874: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_r6v2py4s/ansible_setup_payload.zip'<<< 49116 1727204679.46885: stdout chunk (state=3): >>> <<< 49116 1727204679.46899: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.47191: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.47247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 49116 1727204679.47281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 
49116 1727204679.47326: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 49116 1727204679.47461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 49116 1727204679.47468: stdout chunk (state=3): >>> <<< 49116 1727204679.47508: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 49116 1727204679.47511: stdout chunk (state=3): >>> <<< 49116 1727204679.47521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 49116 1727204679.47540: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0347121e0><<< 49116 1727204679.47544: stdout chunk (state=3): >>> <<< 49116 1727204679.47591: stdout chunk (state=3): >>>import '_typing' # <<< 49116 1727204679.47904: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0346e90d0><<< 49116 1727204679.47937: stdout chunk (state=3): >>> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0346e8230> # zipimport: zlib available <<< 49116 1727204679.48063: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 49116 1727204679.48111: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available<<< 49116 1727204679.48131: stdout chunk (state=3): >>> <<< 49116 1727204679.50822: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.52985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 49116 1727204679.52990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 49116 1727204679.53020: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0346eb5f0> <<< 49116 1727204679.53061: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 49116 1727204679.53085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204679.53130: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 49116 1727204679.53155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 49116 1727204679.53200: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 49116 1727204679.53222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 49116 1727204679.53225: stdout chunk (state=3): >>> <<< 49116 1727204679.53254: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.53338: stdout chunk (state=3): >>> # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fd034741ca0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034741a30><<< 49116 1727204679.53357: stdout chunk (state=3): >>> <<< 49116 1727204679.53395: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034741340> <<< 49116 1727204679.53496: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 49116 1727204679.53514: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034741790><<< 49116 1727204679.53538: stdout chunk (state=3): >>> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034712e70><<< 49116 1727204679.53548: stdout chunk (state=3): >>> import 'atexit' # <<< 49116 1727204679.53683: stdout chunk (state=3): >>> # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.53697: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0347429c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034742c00><<< 49116 1727204679.53728: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 49116 1727204679.53825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 49116 1727204679.53858: stdout chunk (state=3): >>> import '_locale' # <<< 49116 1727204679.53924: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034743050><<< 49116 1727204679.53956: stdout chunk (state=3): >>> import 'pwd' # <<< 49116 1727204679.54003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 49116 1727204679.54043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 49116 1727204679.54104: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345a8e30><<< 49116 1727204679.54170: stdout chunk (state=3): >>> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.54173: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.54217: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345aaa50> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 49116 1727204679.54220: stdout chunk (state=3): >>> <<< 49116 1727204679.54245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 49116 1727204679.54316: stdout chunk (state=3): >>> import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ab410> <<< 49116 1727204679.54350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 49116 1727204679.54416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 49116 1727204679.54449: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ac5f0> <<< 49116 1727204679.54494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 49116 1727204679.54587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 49116 1727204679.54705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345af0e0><<< 49116 1727204679.54720: stdout chunk (state=3): >>> <<< 49116 1727204679.54774: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.54796: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345af200> <<< 49116 1727204679.54849: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ad3a0><<< 49116 1727204679.54878: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 49116 1727204679.54937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 49116 1727204679.54978: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 49116 1727204679.54993: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 49116 1727204679.55021: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 49116 1727204679.55087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 49116 1727204679.55106: stdout chunk (state=3): >>> <<< 49116 1727204679.55137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 49116 1727204679.55153: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345b3110> <<< 49116 1727204679.55174: stdout chunk (state=3): >>>import '_tokenize' # <<< 49116 1727204679.55313: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345b1be0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345b1940> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 49116 
1727204679.55335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 49116 1727204679.55489: stdout chunk (state=3): >>> <<< 49116 1727204679.55524: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345b3e90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ad8b0> <<< 49116 1727204679.55576: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.55617: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345f71d0> <<< 49116 1727204679.55642: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 49116 1727204679.55693: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345f7320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 49116 1727204679.55735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 49116 1727204679.55836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.55886: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345fcef0> <<< 49116 1727204679.55901: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345fcce0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 49116 1727204679.56114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 49116 1727204679.56131: stdout chunk (state=3): >>> <<< 49116 1727204679.56208: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345ff3b0><<< 49116 1727204679.56219: stdout chunk (state=3): >>> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345fd550><<< 49116 1727204679.56254: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 49116 1727204679.56335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 49116 1727204679.56367: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 49116 1727204679.56415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 49116 1727204679.56435: stdout chunk (state=3): >>> import '_string' # <<< 49116 1727204679.56502: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034606b10><<< 49116 1727204679.56611: stdout chunk (state=3): >>> <<< 49116 1727204679.56748: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ff4a0> <<< 49116 1727204679.56885: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.56889: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.56899: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034607800> <<< 49116 1727204679.56956: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.56972: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034607aa0> <<< 49116 1727204679.57040: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.57081: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034606ed0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345f7620> <<< 49116 1727204679.57142: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 49116 1727204679.57169: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 49116 1727204679.57232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 49116 1727204679.57260: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.57324: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.57327: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03460b470><<< 49116 1727204679.57506: stdout chunk (state=3): >>> <<< 49116 1727204679.57627: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 
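The "# code object from ..." and "import x # <...Loader object at ...>" lines in this stream are the interpreter's own verbose import trace; the remote module runs with PYTHONVERBOSE=1, as the ansible_env facts later in the output show. A short standard-library-only sketch of how the same loader/origin information can be queried programmatically (the module names are just examples taken from the trace):

from importlib.util import find_spec

# For each module, report which loader handles it and where it comes from,
# mirroring the SourceFileLoader / ExtensionFileLoader lines in the trace.
for name in ("json", "selectors", "subprocess", "_socket"):
    spec = find_spec(name)
    if spec is None:
        print(f"{name} -> not found")
    else:
        print(f"{name} -> {type(spec.loader).__name__} from {spec.origin}")
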
49116 1727204679.57665: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03460c950><<< 49116 1727204679.57699: stdout chunk (state=3): >>> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034609c10> <<< 49116 1727204679.57745: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.57771: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.57844: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03460afc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034609850> # zipimport: zlib available<<< 49116 1727204679.57856: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49116 1727204679.58227: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 49116 1727204679.58232: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204679.58271: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 49116 1727204679.58274: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.58303: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49116 1727204679.58349: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # <<< 49116 1727204679.58352: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49116 1727204679.58714: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.58786: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.59883: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.60959: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 49116 1727204679.61012: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 49116 1727204679.61029: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 49116 1727204679.61054: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 49116 1727204679.61095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 49116 1727204679.61105: stdout chunk (state=3): >>> <<< 49116 1727204679.61190: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.61202: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034494b60><<< 49116 1727204679.61359: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 49116 1727204679.61424: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034495a00><<< 49116 1727204679.61427: stdout chunk (state=3): >>> <<< 49116 1727204679.61430: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03460f530><<< 49116 1727204679.61464: stdout chunk (state=3): >>> <<< 49116 1727204679.61520: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 49116 1727204679.61546: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.61596: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.61619: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 49116 1727204679.61652: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.61662: stdout chunk (state=3): >>> <<< 49116 1727204679.61936: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.62227: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py<<< 49116 1727204679.62254: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 49116 1727204679.62276: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034495940> # zipimport: zlib available <<< 49116 1727204679.63328: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.64081: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.64204: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.64332: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 49116 1727204679.64355: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.64430: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.64444: stdout chunk (state=3): >>> <<< 49116 1727204679.64584: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 49116 1727204679.64588: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.64660: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.64823: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 49116 1727204679.64851: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.64905: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 49116 1727204679.64927: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49116 1727204679.64967: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.65041: stdout chunk (state=3): >>> import 'ansible.module_utils.parsing.convert_bool' # <<< 49116 1727204679.65071: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.65514: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.65526: stdout chunk (state=3): >>> <<< 49116 1727204679.65950: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 49116 1727204679.66026: stdout chunk (state=3): >>> <<< 49116 1727204679.66086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 49116 1727204679.66119: stdout chunk (state=3): >>>import '_ast' # <<< 49116 1727204679.66242: stdout chunk (state=3): 
>>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0344968a0> <<< 49116 1727204679.66268: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.66382: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.66526: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 49116 1727204679.66535: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # <<< 49116 1727204679.66629: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 49116 1727204679.66715: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.66819: stdout chunk (state=3): >>> <<< 49116 1727204679.66938: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03449e3c0><<< 49116 1727204679.66968: stdout chunk (state=3): >>> <<< 49116 1727204679.67031: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.67038: stdout chunk (state=3): >>> <<< 49116 1727204679.67081: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03449ed20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034497a10> <<< 49116 1727204679.67109: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.67128: stdout chunk (state=3): >>> <<< 49116 1727204679.67266: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 49116 1727204679.67292: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.67360: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.67442: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.67546: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.67673: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 49116 1727204679.67761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204679.67962: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.67970: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03449da60> <<< 49116 1727204679.68111: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03449f020> <<< 49116 
1727204679.68117: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 49116 1727204679.68135: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.68209: stdout chunk (state=3): >>> <<< 49116 1727204679.68280: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.68294: stdout chunk (state=3): >>> <<< 49116 1727204679.68389: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.68443: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.68446: stdout chunk (state=3): >>> <<< 49116 1727204679.68515: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 49116 1727204679.68529: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 49116 1727204679.68572: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 49116 1727204679.68627: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 49116 1727204679.68652: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 49116 1727204679.68787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 49116 1727204679.68809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 49116 1727204679.68844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 49116 1727204679.69015: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345330b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0344abef0> <<< 49116 1727204679.69225: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0344a6fc0> <<< 49116 1727204679.69229: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0344a6e10> <<< 49116 1727204679.69231: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 49116 1727204679.69233: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.69276: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49116 1727204679.69339: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 49116 1727204679.69350: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 49116 1727204679.69457: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available<<< 49116 1727204679.69489: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49116 1727204679.69493: stdout chunk (state=3): >>> import 'ansible.modules' # <<< 49116 1727204679.69532: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49116 1727204679.69646: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.69707: stdout chunk (state=3): >>> <<< 49116 1727204679.69779: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 
1727204679.69799: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.69849: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.69870: stdout chunk (state=3): >>> <<< 49116 1727204679.69920: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.70087: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204679.70155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 49116 1727204679.70171: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.70318: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.70408: stdout chunk (state=3): >>> <<< 49116 1727204679.70448: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.70491: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.70494: stdout chunk (state=3): >>> <<< 49116 1727204679.70570: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 49116 1727204679.70581: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.70918: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.71242: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.71308: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.71394: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py<<< 49116 1727204679.71442: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 49116 1727204679.71477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 49116 1727204679.71556: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 49116 1727204679.71595: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034535f40> <<< 49116 1727204679.71631: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py<<< 49116 1727204679.71666: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 49116 1727204679.71697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py<<< 49116 1727204679.71781: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 49116 1727204679.71815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 49116 1727204679.71853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 49116 1727204679.71889: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033abc650> <<< 49116 
1727204679.71946: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.71975: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033abc9b0> <<< 49116 1727204679.72095: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345156a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034514920> <<< 49116 1727204679.72141: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034534620><<< 49116 1727204679.72172: stdout chunk (state=3): >>> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034534a10> <<< 49116 1727204679.72271: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 49116 1727204679.72304: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py<<< 49116 1727204679.72331: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 49116 1727204679.72392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 49116 1727204679.72396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 49116 1727204679.72445: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.72473: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033abfa10> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033abf2c0><<< 49116 1727204679.72514: stdout chunk (state=3): >>> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204679.72525: stdout chunk (state=3): >>> # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.72560: stdout chunk (state=3): >>>import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033abf470><<< 49116 1727204679.72585: stdout chunk (state=3): >>> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033abe6f0> <<< 49116 1727204679.72605: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py<<< 49116 1727204679.72656: stdout chunk (state=3): >>> <<< 49116 1727204679.72843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033abfa70> <<< 49116 1727204679.72868: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 49116 1727204679.72966: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033b225a0> <<< 49116 1727204679.72993: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b205c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034535700> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 49116 1727204679.73054: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73071: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 49116 1727204679.73177: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 49116 1727204679.73211: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73251: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 49116 1727204679.73380: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73399: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204679.73419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 49116 1727204679.73483: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73600: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 49116 1727204679.73603: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73674: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 49116 1727204679.73678: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73754: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73772: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73860: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.73923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 49116 1727204679.73959: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.74831: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.75634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 49116 1727204679.75728: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.75808: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.75850: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 49116 1727204679.75903: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 49116 1727204679.75937: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.75965: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.76022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 49116 1727204679.76047: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.76113: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.76222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 49116 1727204679.76271: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.76294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 49116 1727204679.76335: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.76362: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.76412: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 49116 1727204679.76616: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.76696: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 49116 1727204679.76701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 49116 1727204679.76730: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b222d0> <<< 49116 1727204679.76757: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 49116 1727204679.76857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 49116 1727204679.77038: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b234a0> <<< 49116 1727204679.77042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 49116 1727204679.77123: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.77190: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.77328: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 49116 1727204679.77368: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.77482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 49116 1727204679.77485: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.77580: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.77640: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 49116 1727204679.77673: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.77772: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.77806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 49116 1727204679.77880: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.77957: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033b5a960> <<< 49116 1727204679.78228: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b42660> import 'ansible.module_utils.facts.system.python' # <<< 49116 1727204679.78258: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.78379: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.78382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 49116 1727204679.78428: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.78515: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.78646: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.78828: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 49116 1727204679.78944: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.78961: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204679.79014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 49116 1727204679.79032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 49116 1727204679.79061: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204679.79089: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03394df10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b438f0> import 'ansible.module_utils.facts.system.user' # <<< 49116 1727204679.79150: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 49116 1727204679.79195: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.79261: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 49116 1727204679.79471: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.79597: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 49116 1727204679.79702: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.79723: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.79832: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.79868: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.79932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 49116 1727204679.80032: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 49116 1727204679.80045: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.80146: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.80303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 49116 1727204679.80451: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.80476: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.80580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 49116 1727204679.80603: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.80626: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.80680: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.81347: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.82258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 49116 1727204679.82291: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.82464: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.82755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 49116 1727204679.82890: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.83015: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 49116 1727204679.83071: stdout chunk (state=3): >>> <<< 49116 1727204679.83093: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.83388: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.83689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 49116 1727204679.83703: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.83743: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 49116 1727204679.83770: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.83837: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.83916: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 49116 1727204679.84116: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204679.84275: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.84673: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.84789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 49116 1727204679.84810: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.84842: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.84891: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 49116 1727204679.84918: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.84954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 49116 1727204679.85081: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.85111: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.fc_wwn' # <<< 49116 1727204679.85129: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.85318: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.85322: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 49116 1727204679.85324: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.85381: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.85439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 49116 1727204679.85461: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.85778: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.86254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 49116 1727204679.86343: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.86434: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 49116 1727204679.86479: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.86534: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 49116 1727204679.86561: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.86596: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.86642: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 49116 1727204679.86654: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.86691: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.86879: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 49116 1727204679.87020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 49116 1727204679.87037: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87068: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 49116 1727204679.87082: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87130: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87196: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 49116 1727204679.87202: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87221: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87249: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87326: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87468: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87517: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 49116 1727204679.87653: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 49116 1727204679.87730: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.87824: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 49116 1727204679.88316: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.88605: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 49116 1727204679.88635: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.88729: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.88816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 49116 1727204679.88846: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.88933: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.89029: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.openbsd' # <<< 49116 1727204679.89051: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.89218: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204679.89241: stdout chunk (state=3): >>> <<< 49116 1727204679.89369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 49116 1727204679.89413: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.89587: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.89642: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 49116 1727204679.89735: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204679.90290: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 49116 1727204679.90295: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 49116 1727204679.90299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033976930> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033975580> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0339723f0> <<< 49116 1727204679.91286: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": 
"3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_lsb": {}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "39", "epoch": "1727204679", "epoch_int": "1727204679", "date": "2024-09-24", "time": "15:04:39", "iso8601_micro": "2024-09-24T19:04:39.905951Z", "iso8601": "2024-09-24T19:04:39Z", "iso8601_basic": "20240924T150439905951", "iso8601_basic_short": "20240924T150439", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 49116 1727204679.92132: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing 
_bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat <<< 49116 1727204679.92175: stdout chunk (state=3): >>># destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # 
cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 49116 1727204679.92216: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor <<< 49116 1727204679.92257: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # 
destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr <<< 49116 1727204679.92490: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution <<< 49116 1727204679.92494: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 49116 1727204679.92753: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 49116 1727204679.92784: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 49116 1727204679.92820: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 49116 1727204679.92861: stdout chunk (state=3): >>># destroy ntpath <<< 49116 1727204679.93087: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 49116 1727204679.93130: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 49116 1727204679.93163: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 49116 1727204679.93196: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 49116 1727204679.93213: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 49116 1727204679.93264: stdout chunk (state=3): >>># destroy _ssl <<< 49116 1727204679.93270: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 49116 1727204679.93284: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 49116 1727204679.93321: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 49116 1727204679.93336: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 49116 1727204679.93392: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 49116 1727204679.93422: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] 
wiping systemd._daemon # cleanup[3] wiping _socket <<< 49116 1727204679.93487: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 49116 1727204679.93693: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 49116 1727204679.93889: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 49116 1727204679.93908: stdout chunk (state=3): >>># destroy _collections <<< 49116 1727204679.93941: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 49116 1727204679.93958: stdout chunk (state=3): >>># destroy tokenize <<< 49116 1727204679.93984: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 49116 1727204679.94033: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 49116 1727204679.94316: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 49116 1727204679.94320: stdout chunk (state=3): >>># clear sys.meta_path # 
clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 49116 1727204679.94333: stdout chunk (state=3): >>># clear sys.audit hooks <<< 49116 1727204679.94889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204679.95062: stderr chunk (state=3): >>><<< 49116 1727204679.95077: stdout chunk (state=3): >>><<< 49116 1727204679.95587: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034bfc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034bcbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034bfeab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0349d11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0349d20c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 
00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a0ffb0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a24140> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a47950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a47fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a27c20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a253a0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a0d160> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a6b8f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a6a510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd034a26240> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a68d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a98980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a0c3e0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034a98e30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a98ce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034a990d0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a0af00> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a997c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a99490> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a9a6c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034ab48c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034ab5fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code 
object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034ab6e40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034ab7470> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034ab6390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034ab7e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034ab7590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a9a6f0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0347fbce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034828860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348285c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034828800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0348289e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0347f9e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from 
'/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03482a090> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034828d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034a9ade0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348523c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03486a510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348a72c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348c9a60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0348a73e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03486b170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0346a4440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034869550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03482af90> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd0346a46e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_r6v2py4s/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches 
/usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0347121e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0346e90d0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0346e8230> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0346eb5f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034741ca0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034741a30> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034741340> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034741790> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034712e70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0347429c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034742c00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034743050> import 'pwd' # # 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345a8e30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345aaa50> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ab410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ac5f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345af0e0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345af200> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ad3a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345b3110> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345b1be0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345b1940> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345b3e90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ad8b0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345f71d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345f7320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345fcef0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345fcce0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd0345ff3b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345fd550> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034606b10> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345ff4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034607800> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034607aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034606ed0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345f7620> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03460b470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03460c950> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034609c10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03460afc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034609850> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd034494b60> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034495a00> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03460f530> import 'ansible.module_utils.compat.selinux' # 
# zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034495940> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0344968a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03449e3c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03449ed20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034497a10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03449da60> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd03449f020> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345330b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0344abef0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0344a6fc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0344a6e10> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034535f40> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033abc650> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033abc9b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0345156a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034514920> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034534620> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034534a10> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033abfa10> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033abf2c0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033abf470> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033abe6f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033abfa70> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033b225a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b205c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd034535700> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b222d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b234a0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033b5a960> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b42660> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd03394df10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033b438f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd033976930> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd033975580> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd0339723f0> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_lsb": {}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "39", "epoch": "1727204679", "epoch_int": "1727204679", "date": "2024-09-24", "time": "15:04:39", "iso8601_micro": "2024-09-24T19:04:39.905951Z", "iso8601": "2024-09-24T19:04:39Z", "iso8601_basic": "20240924T150439905951", "iso8601_basic_short": "20240924T150439", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy 
bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] 
removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat 
# cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping 
_collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] 
removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy 
ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # 
cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy 
ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # 
destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping 
_codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 49116 1727204679.96928: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204679.96931: _low_level_execute_command(): starting 49116 1727204679.96934: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/ > /dev/null 2>&1 && sleep 0' 49116 1727204679.98564: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.98571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.98574: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 49116 1727204679.98576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204679.98579: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49116 1727204679.98581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204679.98583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204679.98585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204679.98656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204679.98957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204679.98985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204679.98994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204679.99158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204680.01301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204680.01362: stderr chunk (state=3): >>><<< 49116 1727204680.01367: stdout chunk (state=3): >>><<< 49116 1727204680.01574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204680.01587: handler run complete 49116 1727204680.01640: variable 'ansible_facts' from source: unknown 49116 1727204680.02072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204680.02392: variable 'ansible_facts' from source: unknown 49116 1727204680.02741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204680.02918: attempt loop complete, returning result 49116 1727204680.02922: _execute() done 49116 1727204680.02924: dumping result to json 49116 1727204680.02938: done dumping result, returning 49116 1727204680.02948: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-02f7-957b-0000000000c0] 
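The exchange above is the tail end of the minimum fact-gathering task: _execute_module(setup, ...) returns with gather_subset 'min', and the controller immediately deletes its per-task remote temp directory by sending one small shell command over the already-open SSH ControlMaster connection (the "auto-mux: Trying existing master" lines). Purely as an illustration of that cleanup call's shape, and not Ansible's internal implementation, a minimal Python sketch might look like the following; the helper name and the direct ssh/subprocess invocation are assumptions made for this note.

    import shlex
    import subprocess

    def cleanup_remote_tmp(host: str, tmpdir: str) -> int:
        # Same command shape as logged by _low_level_execute_command():
        #   /bin/sh -c 'rm -f -r <tmpdir> > /dev/null 2>&1 && sleep 0'
        # The trailing 'sleep 0' is copied verbatim from the logged command.
        remote_cmd = "rm -f -r {} > /dev/null 2>&1 && sleep 0".format(shlex.quote(tmpdir))
        # ssh joins its arguments and hands them to the remote login shell,
        # so the /bin/sh -c payload must be passed as a single quoted string.
        result = subprocess.run(
            ["ssh", host, "/bin/sh -c {}".format(shlex.quote(remote_cmd))],
            capture_output=True, text=True,
        )
        return result.returncode

Run against the directory named in the log (/root/.ansible/tmp/ansible-tmp-1727204679.2239277-49314-214511918793933/), this mirrors the rc=0, empty-stdout result recorded just below.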
49116 1727204680.02954: sending task result for task 127b8e07-fff9-02f7-957b-0000000000c0 ok: [managed-node3] 49116 1727204680.03449: no more pending results, returning what we have 49116 1727204680.03453: results queue empty 49116 1727204680.03454: checking for any_errors_fatal 49116 1727204680.03455: done checking for any_errors_fatal 49116 1727204680.03456: checking for max_fail_percentage 49116 1727204680.03457: done checking for max_fail_percentage 49116 1727204680.03458: checking to see if all hosts have failed and the running result is not ok 49116 1727204680.03459: done checking to see if all hosts have failed 49116 1727204680.03460: getting the remaining hosts for this loop 49116 1727204680.03461: done getting the remaining hosts for this loop 49116 1727204680.03467: getting the next task for host managed-node3 49116 1727204680.03474: done getting next task for host managed-node3 49116 1727204680.03477: ^ task is: TASK: Check if system is ostree 49116 1727204680.03480: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204680.03483: getting variables 49116 1727204680.03484: in VariableManager get_vars() 49116 1727204680.03510: Calling all_inventory to load vars for managed-node3 49116 1727204680.03512: Calling groups_inventory to load vars for managed-node3 49116 1727204680.03515: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204680.03528: Calling all_plugins_play to load vars for managed-node3 49116 1727204680.03530: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204680.03536: Calling groups_plugins_play to load vars for managed-node3 49116 1727204680.04243: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000c0 49116 1727204680.04249: WORKER PROCESS EXITING 49116 1727204680.04395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204680.04941: done with get_vars() 49116 1727204680.04956: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.915) 0:00:03.075 ***** 49116 1727204680.05063: entering _queue_task() for managed-node3/stat 49116 1727204680.05497: worker is 1 (out of 1 available) 49116 1727204680.05511: exiting _queue_task() for managed-node3/stat 49116 1727204680.05525: done queuing things up, now waiting for results queue to drain 49116 1727204680.05526: waiting for pending results... 
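With the facts task reported as ok for managed-node3, the strategy selects the next task, "Check if system is ostree" from el_repo_setup.yml:17, and queues it for the stat action; the timestamped TASK banner also records the previous task's duration (0:00:00.915) and the cumulative runtime (0:00:03.075). Before that module can execute, the worker will create a fresh per-task remote temp directory; the exact "umask 77 && mkdir -p ..." shell it sends appears a few entries further down. The directory name follows the pattern visible in this log, ansible-tmp-<epoch time>-<worker pid>-<random suffix>. A minimal sketch of that naming, assuming a helper of this shape and an arbitrary suffix width (neither is Ansible's own code):

    import os
    import random
    import time

    def ansible_tmp_dirname(remote_tmp: str = "~/.ansible/tmp") -> str:
        # e.g. ~/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060
        suffix = random.randint(0, 10**15)   # suffix width is an assumption
        return "{}/ansible-tmp-{}-{}-{}".format(remote_tmp, time.time(), os.getpid(), suffix)

    print(ansible_tmp_dirname())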
49116 1727204680.05797: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 49116 1727204680.05918: in run() - task 127b8e07-fff9-02f7-957b-0000000000c2 49116 1727204680.05939: variable 'ansible_search_path' from source: unknown 49116 1727204680.05947: variable 'ansible_search_path' from source: unknown 49116 1727204680.06001: calling self._execute() 49116 1727204680.06095: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.06110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204680.06171: variable 'omit' from source: magic vars 49116 1727204680.06660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204680.06920: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204680.06978: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204680.07018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204680.07085: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204680.07576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204680.07580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204680.07583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204680.07586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204680.07645: Evaluated conditional (not __network_is_ostree is defined): True 49116 1727204680.07901: variable 'omit' from source: magic vars 49116 1727204680.07905: variable 'omit' from source: magic vars 49116 1727204680.07907: variable 'omit' from source: magic vars 49116 1727204680.07994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204680.08038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204680.08143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204680.08171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204680.08189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204680.08339: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204680.08444: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.08448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204680.08578: Set connection var ansible_connection to ssh 49116 1727204680.08600: Set connection var ansible_timeout to 10 49116 1727204680.08613: Set connection var 
ansible_shell_executable to /bin/sh 49116 1727204680.08622: Set connection var ansible_pipelining to False 49116 1727204680.08628: Set connection var ansible_shell_type to sh 49116 1727204680.08637: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204680.08676: variable 'ansible_shell_executable' from source: unknown 49116 1727204680.08779: variable 'ansible_connection' from source: unknown 49116 1727204680.08788: variable 'ansible_module_compression' from source: unknown 49116 1727204680.08795: variable 'ansible_shell_type' from source: unknown 49116 1727204680.08802: variable 'ansible_shell_executable' from source: unknown 49116 1727204680.08809: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.08818: variable 'ansible_pipelining' from source: unknown 49116 1727204680.08825: variable 'ansible_timeout' from source: unknown 49116 1727204680.08832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204680.09218: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204680.09240: variable 'omit' from source: magic vars 49116 1727204680.09253: starting attempt loop 49116 1727204680.09261: running the handler 49116 1727204680.09294: _low_level_execute_command(): starting 49116 1727204680.09472: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204680.10939: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204680.10991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204680.11037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.11144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204680.11193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204680.11388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204680.13210: stdout chunk (state=3): >>>/root <<< 49116 1727204680.13308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204680.13397: stderr chunk (state=3): >>><<< 49116 1727204680.13409: stdout chunk (state=3): >>><<< 49116 1727204680.13445: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204680.13571: _low_level_execute_command(): starting 49116 1727204680.13575: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060 `" && echo ansible-tmp-1727204680.1346185-49355-142256031755060="` echo /root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060 `" ) && sleep 0' 49116 1727204680.14406: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204680.14427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204680.14486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.14648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204680.14784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204680.14938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204680.17121: stdout chunk (state=3): >>>ansible-tmp-1727204680.1346185-49355-142256031755060=/root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060 <<< 49116 1727204680.17287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204680.17359: stderr chunk (state=3): >>><<< 49116 1727204680.17371: stdout chunk (state=3): >>><<< 49116 1727204680.17404: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204680.1346185-49355-142256031755060=/root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060 , stderr=OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204680.17572: variable 'ansible_module_compression' from source: unknown 49116 1727204680.17576: ANSIBALLZ: Using lock for stat 49116 1727204680.17578: ANSIBALLZ: Acquiring lock 49116 1727204680.17580: ANSIBALLZ: Lock acquired: 139720119767296 49116 1727204680.17582: ANSIBALLZ: Creating module 49116 1727204680.34409: ANSIBALLZ: Writing module into payload 49116 1727204680.34495: ANSIBALLZ: Writing module 49116 1727204680.34552: ANSIBALLZ: Renaming module 49116 1727204680.34561: ANSIBALLZ: Done creating module 49116 1727204680.34564: variable 'ansible_facts' from source: unknown 49116 1727204680.34629: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/AnsiballZ_stat.py 49116 1727204680.34776: Sending initial data 49116 1727204680.34780: Sent initial data (153 bytes) 49116 1727204680.35422: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204680.35426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.35429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204680.35438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.35523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204680.35598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204680.38109: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204680.38176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204680.38249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp5f_t9drz /root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/AnsiballZ_stat.py <<< 49116 1727204680.38252: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/AnsiballZ_stat.py" <<< 49116 1727204680.38323: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp5f_t9drz" to remote "/root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/AnsiballZ_stat.py" <<< 49116 1727204680.38326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/AnsiballZ_stat.py" <<< 49116 1727204680.39031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204680.39113: stderr chunk (state=3): >>><<< 49116 1727204680.39117: stdout chunk (state=3): >>><<< 49116 1727204680.39139: done transferring module to remote 49116 1727204680.39152: _low_level_execute_command(): starting 49116 1727204680.39157: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/ /root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/AnsiballZ_stat.py && sleep 0' 49116 1727204680.39886: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.39891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.39966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204680.40048: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204680.42828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204680.42906: stderr chunk (state=3): >>><<< 49116 1727204680.42909: stdout chunk (state=3): >>><<< 49116 1727204680.42923: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49116 1727204680.42926: _low_level_execute_command(): starting 49116 1727204680.42933: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/AnsiballZ_stat.py && sleep 0' 49116 1727204680.43476: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204680.43481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.43484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204680.43486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.43533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204680.43537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204680.43539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204680.43627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204680.47213: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 49116 1727204680.47263: stdout chunk (state=3): >>>import _imp # builtin <<< 49116 
1727204680.47290: stdout chunk (state=3): >>>import '_thread' # <<< 49116 1727204680.47295: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 49116 1727204680.47403: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 49116 1727204680.47465: stdout chunk (state=3): >>>import 'posix' # <<< 49116 1727204680.47509: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 49116 1727204680.47544: stdout chunk (state=3): >>>import 'time' # <<< 49116 1727204680.47551: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 49116 1727204680.47633: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204680.47682: stdout chunk (state=3): >>>import '_codecs' # <<< 49116 1727204680.47701: stdout chunk (state=3): >>>import 'codecs' # <<< 49116 1727204680.47751: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 49116 1727204680.47791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 49116 1727204680.47810: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a314c0530> <<< 49116 1727204680.47903: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3148fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a314c2ab0> import '_signal' # import '_abc' # import 'abc' # <<< 49116 1727204680.47927: stdout chunk (state=3): >>>import 'io' # <<< 49116 1727204680.47967: stdout chunk (state=3): >>>import '_stat' # <<< 49116 1727204680.47974: stdout chunk (state=3): >>>import 'stat' # <<< 49116 1727204680.48113: stdout chunk (state=3): >>>import '_collections_abc' # <<< 49116 1727204680.48158: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 49116 1727204680.48191: stdout chunk (state=3): >>>import 'os' # <<< 49116 1727204680.48219: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 49116 1727204680.48238: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 49116 1727204680.48513: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31271190><<< 49116 1727204680.48537: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31272090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 49116 1727204680.48926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 49116 1727204680.48956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 49116 1727204680.48977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204680.48992: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 49116 1727204680.49061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 49116 1727204680.49083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 49116 1727204680.49109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 49116 1727204680.49122: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312aff50> <<< 49116 1727204680.49149: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 49116 1727204680.49169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 49116 1727204680.49204: stdout chunk (state=3): >>>import '_operator' # <<< 49116 1727204680.49209: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312c40b0> <<< 49116 1727204680.49236: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 49116 1727204680.49271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 49116 1727204680.49308: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 49116 1727204680.49503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312e78f0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312e7f80> import '_collections' # <<< 49116 1727204680.49555: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312c7bf0> <<< 49116 1727204680.49559: stdout chunk (state=3): >>>import 
'_functools' # <<< 49116 1727204680.49612: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312c52e0> <<< 49116 1727204680.49763: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312ad100> <<< 49116 1727204680.49806: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 49116 1727204680.49827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 49116 1727204680.49845: stdout chunk (state=3): >>>import '_sre' # <<< 49116 1727204680.49872: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 49116 1727204680.49916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 49116 1727204680.49939: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 49116 1727204680.49944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 49116 1727204680.49988: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3130b890> <<< 49116 1727204680.50018: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3130a4b0> <<< 49116 1727204680.50214: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312c61b0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31308d10> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133c8c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312ac380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3133cd70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133cc20> <<< 49116 1727204680.50264: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.50275: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3133d010> <<< 49116 1727204680.50277: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9a312aaea0> <<< 49116 1727204680.50310: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 49116 1727204680.50315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204680.50350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 49116 1727204680.50385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 49116 1727204680.50404: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133d700> <<< 49116 1727204680.50408: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133d3d0> import 'importlib.machinery' # <<< 49116 1727204680.50457: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 49116 1727204680.50483: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133e600> <<< 49116 1727204680.50502: stdout chunk (state=3): >>>import 'importlib.util' # <<< 49116 1727204680.50517: stdout chunk (state=3): >>>import 'runpy' # <<< 49116 1727204680.50547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 49116 1727204680.50717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31354830> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31355f70> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 49116 1727204680.50721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 49116 1727204680.50752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 49116 1727204680.50769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 49116 1727204680.50779: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31356de0> <<< 49116 1727204680.50825: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.50836: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31357410> <<< 49116 1727204680.50845: stdout chunk 
(state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31356360> <<< 49116 1727204680.50861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 49116 1727204680.50887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 49116 1727204680.50924: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.50944: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31357e90> <<< 49116 1727204680.51055: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a313575c0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133e660> <<< 49116 1727204680.51058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 49116 1727204680.51103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 49116 1727204680.51127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 49116 1727204680.51213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a310e3d10> <<< 49116 1727204680.51222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 49116 1727204680.51254: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.51263: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3110c860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3110c5c0> <<< 49116 1727204680.51288: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.51293: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3110c890> <<< 49116 1727204680.51510: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3110ca70> import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9a310e1eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 49116 1727204680.51585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 49116 1727204680.51620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 49116 1727204680.51667: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3110e120> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3110cda0> <<< 49116 1727204680.51733: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133e7e0> <<< 49116 1727204680.51741: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 49116 1727204680.51803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204680.51837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 49116 1727204680.51880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 49116 1727204680.51926: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31136480> <<< 49116 1727204680.51989: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 49116 1727204680.52026: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204680.52048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 49116 1727204680.52086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 49116 1727204680.52163: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31152600> <<< 49116 1727204680.52197: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 49116 1727204680.52235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 49116 1727204680.52417: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31187350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 49116 1727204680.52424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 49116 1727204680.52507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 
49116 1727204680.52572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 49116 1727204680.52673: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a311b1af0> <<< 49116 1727204680.52784: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31187470> <<< 49116 1727204680.52844: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31153290> <<< 49116 1727204680.52884: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fcc470> <<< 49116 1727204680.53008: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31151670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3110f020> <<< 49116 1727204680.53091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 49116 1727204680.53110: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9a30fcc710> <<< 49116 1727204680.53234: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_lcvczkx2/ansible_stat_payload.zip' <<< 49116 1727204680.53248: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.53505: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.53527: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 49116 1727204680.53530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 49116 1727204680.53593: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 49116 1727204680.53711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 49116 1727204680.53816: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31026210> import '_typing' # <<< 49116 1727204680.54079: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ffd100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ffc290> <<< 49116 1727204680.54091: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.54124: stdout chunk (state=3): >>>import 'ansible' # <<< 49116 1727204680.54157: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204680.54197: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.54309: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 49116 1727204680.57004: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.59225: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 49116 1727204680.59235: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fff620> <<< 49116 1727204680.59276: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 49116 1727204680.59287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204680.59321: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 49116 1727204680.59345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 49116 1727204680.59379: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 49116 1727204680.59416: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.59624: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31051b20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31051910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31051220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31051670> <<< 49116 1727204680.59660: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31026c30> <<< 49116 1727204680.59663: stdout chunk (state=3): >>>import 'atexit' # <<< 49116 1727204680.59716: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.59739: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a310528d0> <<< 49116 1727204680.59767: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.59793: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31052ae0> <<< 49116 1727204680.59847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 49116 1727204680.59955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 49116 1727204680.59980: stdout chunk (state=3): >>>import '_locale' # <<< 
49116 1727204680.60068: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31052ff0><<< 49116 1727204680.60087: stdout chunk (state=3): >>> import 'pwd' # <<< 49116 1727204680.60120: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py<<< 49116 1727204680.60130: stdout chunk (state=3): >>> <<< 49116 1727204680.60175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 49116 1727204680.60245: stdout chunk (state=3): >>> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb4d70><<< 49116 1727204680.60259: stdout chunk (state=3): >>> <<< 49116 1727204680.60293: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.60327: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30eb6960> <<< 49116 1727204680.60398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 49116 1727204680.60417: stdout chunk (state=3): >>> <<< 49116 1727204680.60470: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb7320><<< 49116 1727204680.60505: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 49116 1727204680.60515: stdout chunk (state=3): >>> <<< 49116 1727204680.60595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb8500><<< 49116 1727204680.60604: stdout chunk (state=3): >>> <<< 49116 1727204680.60628: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 49116 1727204680.60706: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 49116 1727204680.60843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 49116 1727204680.60869: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebaff0><<< 49116 1727204680.60956: stdout chunk (state=3): >>> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.60971: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30ebb110> <<< 49116 1727204680.60999: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb92b0><<< 49116 1727204680.61079: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py <<< 49116 1727204680.61102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 49116 1727204680.61146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 49116 1727204680.61205: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 49116 1727204680.61278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 49116 1727204680.61300: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 49116 1727204680.61345: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebeea0><<< 49116 1727204680.61368: stdout chunk (state=3): >>> import '_tokenize' # <<< 49116 1727204680.61487: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebd970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebd6d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py<<< 49116 1727204680.61517: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 49116 1727204680.61609: stdout chunk (state=3): >>> <<< 49116 1727204680.61691: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebfda0> <<< 49116 1727204680.61753: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb9730> <<< 49116 1727204680.61790: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.61836: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.61839: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f06ff0><<< 49116 1727204680.61919: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 49116 1727204680.61922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 49116 1727204680.61961: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f07110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 49116 1727204680.62019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 49116 1727204680.62050: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 49116 1727204680.62120: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.62146: stdout chunk (state=3): >>> # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.62169: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f0cd40> <<< 49116 1727204680.62482: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f0cb00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 49116 1727204680.62486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 49116 1727204680.62582: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.62585: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f0f260> <<< 49116 1727204680.62607: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f0d430> <<< 49116 1727204680.62653: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 49116 1727204680.62669: stdout chunk (state=3): >>> <<< 49116 1727204680.62779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 49116 1727204680.62797: stdout chunk (state=3): >>> <<< 49116 1727204680.62824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 49116 1727204680.63110: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f12a50> <<< 49116 1727204680.63146: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f0f410> <<< 49116 1727204680.63268: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.63297: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f13d70><<< 49116 1727204680.63304: stdout chunk (state=3): >>> <<< 49116 1727204680.63359: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.63369: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.63385: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f13c20> <<< 49116 1727204680.63469: stdout chunk (state=3): 
>>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.63486: stdout chunk (state=3): >>> <<< 49116 1727204680.63492: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.63501: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f13e90><<< 49116 1727204680.63510: stdout chunk (state=3): >>> <<< 49116 1727204680.63541: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f07410><<< 49116 1727204680.63546: stdout chunk (state=3): >>> <<< 49116 1727204680.63583: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 49116 1727204680.63588: stdout chunk (state=3): >>> <<< 49116 1727204680.63610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 49116 1727204680.63616: stdout chunk (state=3): >>> <<< 49116 1727204680.63653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 49116 1727204680.63661: stdout chunk (state=3): >>> <<< 49116 1727204680.63752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.63757: stdout chunk (state=3): >>> <<< 49116 1727204680.63812: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.63818: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f174d0><<< 49116 1727204680.63834: stdout chunk (state=3): >>> <<< 49116 1727204680.64113: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.64119: stdout chunk (state=3): >>> import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f18680><<< 49116 1727204680.64137: stdout chunk (state=3): >>> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f15c40><<< 49116 1727204680.64187: stdout chunk (state=3): >>> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.64204: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.64224: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f16ff0><<< 49116 1727204680.64233: stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f158b0> <<< 49116 1727204680.64287: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 49116 1727204680.64298: 
stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # <<< 49116 1727204680.64336: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49116 1727204680.64500: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.64602: stdout chunk (state=3): >>> <<< 49116 1727204680.64668: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.64695: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.64724: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 49116 1727204680.64754: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.64760: stdout chunk (state=3): >>> <<< 49116 1727204680.64786: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.64791: stdout chunk (state=3): >>> <<< 49116 1727204680.64818: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 49116 1727204680.64849: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.64898: stdout chunk (state=3): >>> <<< 49116 1727204680.65087: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.65316: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.65327: stdout chunk (state=3): >>> <<< 49116 1727204680.66523: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.67499: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 49116 1727204680.67521: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 49116 1727204680.67548: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 49116 1727204680.67553: stdout chunk (state=3): >>> <<< 49116 1727204680.67573: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 49116 1727204680.67619: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 49116 1727204680.67622: stdout chunk (state=3): >>> <<< 49116 1727204680.67659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 49116 1727204680.67667: stdout chunk (state=3): >>> <<< 49116 1727204680.67752: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.67768: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.67920: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30fa08f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 49116 1727204680.67927: stdout chunk (state=3): >>> <<< 49116 1727204680.67942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 49116 1727204680.67982: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fa1e50><<< 49116 1727204680.67985: stdout chunk (state=3): >>> <<< 49116 1727204680.68017: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f18500> <<< 49116 1727204680.68087: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 49116 1727204680.68091: stdout 
chunk (state=3): >>> <<< 49116 1727204680.68129: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.68227: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.68231: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 49116 1727204680.68247: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.68249: stdout chunk (state=3): >>> <<< 49116 1727204680.68525: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.68811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 49116 1727204680.68864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fa1f40> # zipimport: zlib available <<< 49116 1727204680.69821: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.70605: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.70738: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.70742: stdout chunk (state=3): >>> <<< 49116 1727204680.70867: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 49116 1727204680.70876: stdout chunk (state=3): >>> <<< 49116 1727204680.70893: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.70956: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49116 1727204680.70960: stdout chunk (state=3): >>> <<< 49116 1727204680.71020: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 49116 1727204680.71059: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.71211: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.71227: stdout chunk (state=3): >>> <<< 49116 1727204680.71375: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 49116 1727204680.71394: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 49116 1727204680.71430: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49116 1727204680.71448: stdout chunk (state=3): >>> <<< 49116 1727204680.71573: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 49116 1727204680.71577: stdout chunk (state=3): >>> <<< 49116 1727204680.71603: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.72031: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.72050: stdout chunk (state=3): >>> <<< 49116 1727204680.72458: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 49116 1727204680.72577: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 49116 1727204680.72613: stdout chunk (state=3): >>>import '_ast' # <<< 49116 1727204680.72771: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fa28d0> # zipimport: zlib available <<< 49116 1727204680.72896: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.72905: stdout chunk (state=3): >>> <<< 49116 1727204680.73026: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 49116 1727204680.73037: stdout chunk (state=3): >>> <<< 49116 1727204680.73059: 
stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 49116 1727204680.73092: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # <<< 49116 1727204680.73143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 49116 1727204680.73261: stdout chunk (state=3): >>> # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.73268: stdout chunk (state=3): >>> <<< 49116 1727204680.73477: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.73487: stdout chunk (state=3): >>> import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30dae330><<< 49116 1727204680.73500: stdout chunk (state=3): >>> <<< 49116 1727204680.73576: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.73587: stdout chunk (state=3): >>> <<< 49116 1727204680.73606: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 49116 1727204680.73622: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30daec60> <<< 49116 1727204680.73636: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fa3800> <<< 49116 1727204680.73670: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.73675: stdout chunk (state=3): >>> <<< 49116 1727204680.73746: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.73815: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # <<< 49116 1727204680.73821: stdout chunk (state=3): >>> <<< 49116 1727204680.73850: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.73927: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.74007: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.74106: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49116 1727204680.74110: stdout chunk (state=3): >>> <<< 49116 1727204680.74235: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 49116 1727204680.74238: stdout chunk (state=3): >>> <<< 49116 1727204680.74321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 49116 1727204680.74327: stdout chunk (state=3): >>> <<< 49116 1727204680.74481: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 49116 1727204680.74493: stdout chunk (state=3): >>> <<< 49116 1727204680.74517: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30dada90><<< 49116 
1727204680.74523: stdout chunk (state=3): >>> <<< 49116 1727204680.74610: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30daeea0><<< 49116 1727204680.74656: stdout chunk (state=3): >>> import 'ansible.module_utils.common.file' # <<< 49116 1727204680.74671: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 49116 1727204680.74699: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49116 1727204680.74704: stdout chunk (state=3): >>> <<< 49116 1727204680.74948: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49116 1727204680.74986: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.74996: stdout chunk (state=3): >>> <<< 49116 1727204680.75148: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 49116 1727204680.75151: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 49116 1727204680.75154: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 49116 1727204680.75186: stdout chunk (state=3): >>> <<< 49116 1727204680.75190: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 49116 1727204680.75229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 49116 1727204680.75324: stdout chunk (state=3): >>> <<< 49116 1727204680.75351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 49116 1727204680.75374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 49116 1727204680.75401: stdout chunk (state=3): >>> <<< 49116 1727204680.75426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 49116 1727204680.75556: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30e42f90> <<< 49116 1727204680.75653: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30dbbe30><<< 49116 1727204680.75777: stdout chunk (state=3): >>> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30db6ed0> <<< 49116 1727204680.75804: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30db6d20> # destroy ansible.module_utils.distro<<< 49116 1727204680.75821: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # <<< 49116 1727204680.75847: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.75852: stdout chunk (state=3): >>> <<< 49116 1727204680.75908: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.75919: stdout chunk (state=3): >>> <<< 49116 1727204680.75956: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 49116 1727204680.75962: stdout chunk (state=3): >>> <<< 49116 1727204680.75978: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 49116 1727204680.76076: stdout chunk (state=3): >>>import 
'ansible.module_utils.basic' # <<< 49116 1727204680.76084: stdout chunk (state=3): >>> <<< 49116 1727204680.76104: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.76112: stdout chunk (state=3): >>> <<< 49116 1727204680.76152: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available<<< 49116 1727204680.76159: stdout chunk (state=3): >>> <<< 49116 1727204680.76500: stdout chunk (state=3): >>># zipimport: zlib available <<< 49116 1727204680.76761: stdout chunk (state=3): >>># zipimport: zlib available<<< 49116 1727204680.76773: stdout chunk (state=3): >>> <<< 49116 1727204680.76980: stdout chunk (state=3): >>> <<< 49116 1727204680.76993: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}<<< 49116 1727204680.77003: stdout chunk (state=3): >>> <<< 49116 1727204680.77035: stdout chunk (state=3): >>># destroy __main__<<< 49116 1727204680.77038: stdout chunk (state=3): >>> <<< 49116 1727204680.77551: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 49116 1727204680.77565: stdout chunk (state=3): >>> <<< 49116 1727204680.77577: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._<<< 49116 1727204680.77598: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value<<< 49116 1727204680.77620: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys<<< 49116 1727204680.77653: stdout chunk (state=3): >>> # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs<<< 49116 1727204680.77729: stdout chunk (state=3): >>> # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # 
cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 49116 1727204680.77738: stdout chunk (state=3): >>># cleanup[2] removing _random<<< 49116 1727204680.77755: stdout chunk (state=3): >>> # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading<<< 49116 1727204680.77777: stdout chunk (state=3): >>> # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob<<< 49116 1727204680.77801: stdout chunk (state=3): >>> # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil <<< 49116 1727204680.77821: stdout chunk (state=3): >>># destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder<<< 49116 1727204680.77863: stdout chunk (state=3): >>> # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal<<< 49116 1727204680.77873: stdout chunk (state=3): >>> # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback<<< 49116 1727204680.77897: stdout chunk (state=3): >>> # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string <<< 49116 1727204680.77916: stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing 
_socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket<<< 49116 1727204680.77948: stdout chunk (state=3): >>> # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six<<< 49116 1727204680.77959: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text<<< 49116 1727204680.77988: stdout chunk (state=3): >>> # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing<<< 49116 1727204680.77994: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 49116 1727204680.78021: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec <<< 49116 1727204680.78037: stdout chunk (state=3): >>># cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux<<< 49116 1727204680.78108: stdout chunk (state=3): >>> # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy 
ansible.modules <<< 49116 1727204680.78520: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 49116 1727204680.78547: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util<<< 49116 1727204680.78587: stdout chunk (state=3): >>> # destroy _bz2 <<< 49116 1727204680.78622: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy binascii <<< 49116 1727204680.78655: stdout chunk (state=3): >>># destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 49116 1727204680.78670: stdout chunk (state=3): >>> # destroy zipfile # destroy pathlib<<< 49116 1727204680.78728: stdout chunk (state=3): >>> # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 49116 1727204680.78790: stdout chunk (state=3): >>># destroy importlib # destroy zipimport<<< 49116 1727204680.78797: stdout chunk (state=3): >>> # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner<<< 49116 1727204680.78877: stdout chunk (state=3): >>> # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd<<< 49116 1727204680.78880: stdout chunk (state=3): >>> # destroy locale # destroy signal # destroy fcntl<<< 49116 1727204680.78896: stdout chunk (state=3): >>> # destroy select # destroy _signal<<< 49116 1727204680.78902: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog<<< 49116 1727204680.78944: stdout chunk (state=3): >>> # destroy uuid # destroy selectors <<< 49116 1727204680.78973: stdout chunk (state=3): >>># destroy errno<<< 49116 1727204680.78991: stdout chunk (state=3): >>> # destroy array <<< 49116 1727204680.79039: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib # destroy _blake2<<< 49116 1727204680.79096: stdout chunk (state=3): >>> # destroy selinux<<< 49116 1727204680.79100: stdout chunk (state=3): >>> # destroy shutil # destroy distro<<< 49116 1727204680.79114: stdout chunk (state=3): >>> # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex<<< 49116 1727204680.79178: stdout chunk (state=3): >>> # destroy subprocess # cleanup[3] wiping selinux._selinux<<< 49116 1727204680.79200: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian<<< 49116 1727204680.79237: stdout chunk (state=3): >>> <<< 49116 1727204680.79240: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 49116 1727204680.79243: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves<<< 49116 1727204680.79307: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 49116 1727204680.79313: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string<<< 49116 1727204680.79316: stdout chunk (state=3): >>> # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize<<< 49116 1727204680.79347: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize<<< 49116 1727204680.79351: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 49116 1727204680.79353: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # 
cleanup[3] wiping collections.abc<<< 49116 1727204680.79379: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 49116 1727204680.79404: stdout chunk (state=3): >>> # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings<<< 49116 1727204680.79423: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap_external<<< 49116 1727204680.79427: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 49116 1727204680.79447: stdout chunk (state=3): >>> # cleanup[3] wiping re # destroy re._constants<<< 49116 1727204680.79452: stdout chunk (state=3): >>> # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 49116 1727204680.79478: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 49116 1727204680.79488: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 49116 1727204680.79516: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 49116 1727204680.79527: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types<<< 49116 1727204680.79567: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath<<< 49116 1727204680.79630: stdout chunk (state=3): >>> # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 49116 1727204680.79644: stdout chunk (state=3): >>># cleanup[3] wiping builtins<<< 49116 1727204680.79682: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal <<< 49116 1727204680.79915: stdout chunk (state=3): >>># destroy _datetime <<< 49116 1727204680.79979: stdout chunk (state=3): >>># destroy sys.monitoring <<< 49116 1727204680.80024: stdout chunk (state=3): >>># destroy _socket <<< 49116 1727204680.80036: stdout chunk (state=3): >>># destroy _collections <<< 49116 1727204680.80089: stdout chunk (state=3): >>># destroy platform<<< 49116 1727204680.80093: stdout chunk (state=3): >>> <<< 49116 1727204680.80095: stdout chunk (state=3): >>># destroy _uuid <<< 49116 1727204680.80147: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser<<< 49116 1727204680.80160: stdout chunk (state=3): >>> # destroy tokenize<<< 49116 1727204680.80166: stdout chunk (state=3): >>> <<< 49116 1727204680.80201: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib<<< 49116 1727204680.80220: stdout chunk (state=3): >>> <<< 49116 1727204680.80223: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib<<< 49116 1727204680.80245: 
stdout chunk (state=3): >>> <<< 49116 1727204680.80355: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error<<< 49116 1727204680.80384: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response<<< 49116 1727204680.80401: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io<<< 49116 1727204680.80436: stdout chunk (state=3): >>> # destroy marshal # clear sys.meta_path<<< 49116 1727204680.80461: stdout chunk (state=3): >>> # clear sys.modules # destroy _frozen_importlib<<< 49116 1727204680.80682: stdout chunk (state=3): >>> # destroy codecs<<< 49116 1727204680.80695: stdout chunk (state=3): >>> # destroy encodings.aliases <<< 49116 1727204680.80701: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 49116 1727204680.80707: stdout chunk (state=3): >>># destroy encodings.cp437 <<< 49116 1727204680.80714: stdout chunk (state=3): >>># destroy _codecs # destroy io<<< 49116 1727204680.80778: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 49116 1727204680.80784: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math<<< 49116 1727204680.80787: stdout chunk (state=3): >>> # destroy _bisect # destroy time<<< 49116 1727204680.81124: stdout chunk (state=3): >>> # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 49116 1727204680.81524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204680.81606: stderr chunk (state=3): >>><<< 49116 1727204680.81610: stdout chunk (state=3): >>><<< 49116 1727204680.81678: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a314c0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3148fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a314c2ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31271190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31272090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312aff50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312c40b0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312e78f0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312e7f80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312c7bf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312c52e0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312ad100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3130b890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3130a4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312c61b0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31308d10> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133c8c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312ac380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3133cd70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133cc20> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3133d010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a312aaea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133d700> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133d3d0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133e600> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31354830> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31355f70> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9a31356de0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31357410> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31356360> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31357e90> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a313575c0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133e660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a310e3d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3110c860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3110c5c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3110c890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a3110ca70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a310e1eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3110e120> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3110cda0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3133e7e0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31136480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31152600> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31187350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a311b1af0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31187470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31153290> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fcc470> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31151670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a3110f020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9a30fcc710> # zipimport: found 30 names in '/tmp/ansible_stat_payload_lcvczkx2/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31026210> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ffd100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ffc290> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fff620> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31051b20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31051910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31051220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31051670> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31026c30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a310528d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a31052ae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a31052ff0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb4d70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30eb6960> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb7320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb8500> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebaff0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30ebb110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb92b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebeea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebd970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebd6d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30ebfda0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30eb9730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f06ff0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f07110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f0cd40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f0cb00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f0f260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f0d430> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f12a50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f0f410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f13d70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f13c20> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f13e90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f07410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f174d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f18680> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f15c40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30f16ff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f158b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30fa08f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fa1e50> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30f18500> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fa1f40> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fa28d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30dae330> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30daec60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30fa3800> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9a30dada90> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30daeea0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30e42f90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30dbbe30> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30db6ed0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9a30db6d20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
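Note on the output above: the stat module's actual result ({"changed": false, "stat": {"exists": false}, ...} for /run/ostree-booted) is embedded in a stdout stream that also carries the remote Python interpreter's import and cleanup chatter, which is why the controller emits the "[WARNING]: Module invocation had junk after the JSON data" message that follows. As an illustration only (this is not Ansible's own parsing code, and split_module_output is a hypothetical helper), the sketch below shows one way such mixed output can be separated into the leading JSON result and the trailing noise.

```python
import json

def split_module_output(stdout: str):
    """Return (result_dict, trailing_junk) from raw module stdout.

    Assumes the module result is the first JSON object that starts a line;
    anything after it on that line (e.g. '# destroy __main__ ...' cleanup
    chatter) is returned separately, mirroring what the "junk after the
    JSON data" warning reports.
    """
    decoder = json.JSONDecoder()
    for line in stdout.splitlines():
        line = line.strip()
        if not line.startswith("{"):
            continue
        try:
            result, end = decoder.raw_decode(line)
        except json.JSONDecodeError:
            continue
        return result, line[end:].strip()
    raise ValueError("no JSON object found in module output")

# Shortened example in the spirit of the log output above (hypothetical input):
raw = ('{"changed": false, "stat": {"exists": false}} '
       '# destroy __main__ # clear sys.path_importer_cache')
result, junk = split_module_output(raw)
print(result["stat"]["exists"])  # False -> /run/ostree-booted absent, not an ostree host
print(bool(junk))                # True  -> interpreter cleanup text trailed the JSON
```

In the run itself the trailing text is harmless: the JSON is still parsed, the task result is {"changed": false, "stat": {"exists": false}}, and the extra interpreter shutdown output is only surfaced as the warning shown next.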
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 49116 1727204680.82640: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204680.82644: _low_level_execute_command(): starting 49116 1727204680.82667: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204680.1346185-49355-142256031755060/ > /dev/null 2>&1 && sleep 0' 49116 1727204680.82776: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204680.82779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.82782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 49116 1727204680.82785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204680.82787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204680.82871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204680.82961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204680.85851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204680.85916: stderr chunk (state=3): >>><<< 49116 1727204680.85920: stdout chunk (state=3): >>><<< 49116 1727204680.85938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49116 1727204680.85942: handler run complete 49116 1727204680.85960: attempt loop complete, returning result 49116 1727204680.85963: _execute() done 49116 1727204680.85967: dumping result to json 49116 1727204680.85970: done dumping result, returning 49116 1727204680.85980: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [127b8e07-fff9-02f7-957b-0000000000c2] 49116 1727204680.85985: 
sending task result for task 127b8e07-fff9-02f7-957b-0000000000c2 49116 1727204680.86086: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000c2 49116 1727204680.86090: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 49116 1727204680.86163: no more pending results, returning what we have 49116 1727204680.86168: results queue empty 49116 1727204680.86169: checking for any_errors_fatal 49116 1727204680.86176: done checking for any_errors_fatal 49116 1727204680.86177: checking for max_fail_percentage 49116 1727204680.86179: done checking for max_fail_percentage 49116 1727204680.86180: checking to see if all hosts have failed and the running result is not ok 49116 1727204680.86181: done checking to see if all hosts have failed 49116 1727204680.86182: getting the remaining hosts for this loop 49116 1727204680.86183: done getting the remaining hosts for this loop 49116 1727204680.86187: getting the next task for host managed-node3 49116 1727204680.86192: done getting next task for host managed-node3 49116 1727204680.86195: ^ task is: TASK: Set flag to indicate system is ostree 49116 1727204680.86198: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204680.86211: getting variables 49116 1727204680.86213: in VariableManager get_vars() 49116 1727204680.86246: Calling all_inventory to load vars for managed-node3 49116 1727204680.86250: Calling groups_inventory to load vars for managed-node3 49116 1727204680.86253: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204680.86265: Calling all_plugins_play to load vars for managed-node3 49116 1727204680.86269: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204680.86272: Calling groups_plugins_play to load vars for managed-node3 49116 1727204680.86447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204680.86768: done with get_vars() 49116 1727204680.86785: done getting variables 49116 1727204680.86934: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.818) 0:00:03.894 ***** 49116 1727204680.86960: entering _queue_task() for managed-node3/set_fact 49116 1727204680.86967: Creating lock for set_fact 49116 1727204680.87390: worker is 1 (out of 1 available) 49116 1727204680.87413: exiting _queue_task() for managed-node3/set_fact 49116 1727204680.87428: done queuing things up, now waiting for results queue to drain 49116 1727204680.87429: waiting for 
pending results... 49116 1727204680.87695: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 49116 1727204680.87779: in run() - task 127b8e07-fff9-02f7-957b-0000000000c3 49116 1727204680.87783: variable 'ansible_search_path' from source: unknown 49116 1727204680.87786: variable 'ansible_search_path' from source: unknown 49116 1727204680.87815: calling self._execute() 49116 1727204680.87905: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.87910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204680.87912: variable 'omit' from source: magic vars 49116 1727204680.88482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204680.88706: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204680.88770: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204680.88797: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204680.88854: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204680.88926: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204680.88948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204680.88971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204680.88990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204680.89138: Evaluated conditional (not __network_is_ostree is defined): True 49116 1727204680.89141: variable 'omit' from source: magic vars 49116 1727204680.89197: variable 'omit' from source: magic vars 49116 1727204680.89337: variable '__ostree_booted_stat' from source: set_fact 49116 1727204680.89375: variable 'omit' from source: magic vars 49116 1727204680.89402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204680.89480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204680.89483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204680.89495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204680.89507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204680.89540: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204680.89546: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.89550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204680.89711: Set connection var ansible_connection to ssh 
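The two tasks around this point implement ostree detection: the stat module was invoked with path '/run/ostree-booted' (see the module arguments logged above), its result is available as the __ostree_booted_stat variable, and this set_fact task is gated on the conditional (not __network_is_ostree is defined). The YAML itself is not reproduced in the log; a minimal sketch of what el_repo_setup.yml presumably contains, assuming the register name and the stat-based expression:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted            # path taken from the stat module arguments logged above
      register: __ostree_booted_stat        # registered name inferred from the variable seen in the log

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # expression is an assumption
      when: not __network_is_ostree is defined                          # conditional string taken from the log

With stat.exists false on this host, the fact evaluates to false, which matches the task result shown just below.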
49116 1727204680.89746: Set connection var ansible_timeout to 10 49116 1727204680.89749: Set connection var ansible_shell_executable to /bin/sh 49116 1727204680.89751: Set connection var ansible_pipelining to False 49116 1727204680.89753: Set connection var ansible_shell_type to sh 49116 1727204680.89755: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204680.89774: variable 'ansible_shell_executable' from source: unknown 49116 1727204680.89779: variable 'ansible_connection' from source: unknown 49116 1727204680.89781: variable 'ansible_module_compression' from source: unknown 49116 1727204680.89784: variable 'ansible_shell_type' from source: unknown 49116 1727204680.89789: variable 'ansible_shell_executable' from source: unknown 49116 1727204680.89792: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.89795: variable 'ansible_pipelining' from source: unknown 49116 1727204680.89830: variable 'ansible_timeout' from source: unknown 49116 1727204680.89836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204680.89923: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204680.89939: variable 'omit' from source: magic vars 49116 1727204680.89943: starting attempt loop 49116 1727204680.89946: running the handler 49116 1727204680.89965: handler run complete 49116 1727204680.89975: attempt loop complete, returning result 49116 1727204680.89978: _execute() done 49116 1727204680.89981: dumping result to json 49116 1727204680.89983: done dumping result, returning 49116 1727204680.90001: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [127b8e07-fff9-02f7-957b-0000000000c3] 49116 1727204680.90004: sending task result for task 127b8e07-fff9-02f7-957b-0000000000c3 49116 1727204680.90105: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000c3 49116 1727204680.90109: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 49116 1727204680.90182: no more pending results, returning what we have 49116 1727204680.90185: results queue empty 49116 1727204680.90187: checking for any_errors_fatal 49116 1727204680.90194: done checking for any_errors_fatal 49116 1727204680.90195: checking for max_fail_percentage 49116 1727204680.90197: done checking for max_fail_percentage 49116 1727204680.90198: checking to see if all hosts have failed and the running result is not ok 49116 1727204680.90199: done checking to see if all hosts have failed 49116 1727204680.90200: getting the remaining hosts for this loop 49116 1727204680.90201: done getting the remaining hosts for this loop 49116 1727204680.90205: getting the next task for host managed-node3 49116 1727204680.90213: done getting next task for host managed-node3 49116 1727204680.90216: ^ task is: TASK: Fix CentOS6 Base repo 49116 1727204680.90255: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204680.90259: getting variables 49116 1727204680.90260: in VariableManager get_vars() 49116 1727204680.90289: Calling all_inventory to load vars for managed-node3 49116 1727204680.90292: Calling groups_inventory to load vars for managed-node3 49116 1727204680.90295: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204680.90307: Calling all_plugins_play to load vars for managed-node3 49116 1727204680.90310: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204680.90326: Calling groups_plugins_play to load vars for managed-node3 49116 1727204680.90515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204680.90657: done with get_vars() 49116 1727204680.90667: done getting variables 49116 1727204680.90767: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.038) 0:00:03.932 ***** 49116 1727204680.90790: entering _queue_task() for managed-node3/copy 49116 1727204680.91105: worker is 1 (out of 1 available) 49116 1727204680.91117: exiting _queue_task() for managed-node3/copy 49116 1727204680.91134: done queuing things up, now waiting for results queue to drain 49116 1727204680.91136: waiting for pending results... 
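The 'Fix CentOS6 Base repo' task (el_repo_setup.yml:26) loads the copy action and, as the next lines show, is skipped because ansible_distribution == 'CentOS' evaluates to False on this node. Only the condition string is visible in the log; the destination and file contents below are purely illustrative:

    - name: Fix CentOS6 Base repo
      copy:
        dest: /etc/yum.repos.d/CentOS-Base.repo   # illustrative destination, not confirmed by the log
        content: |
          # replacement repo definitions would go here
      when: ansible_distribution == 'CentOS'       # condition string taken from the skip result below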
49116 1727204680.91368: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 49116 1727204680.91440: in run() - task 127b8e07-fff9-02f7-957b-0000000000c5 49116 1727204680.91455: variable 'ansible_search_path' from source: unknown 49116 1727204680.91459: variable 'ansible_search_path' from source: unknown 49116 1727204680.91509: calling self._execute() 49116 1727204680.91564: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.91573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204680.91582: variable 'omit' from source: magic vars 49116 1727204680.91985: variable 'ansible_distribution' from source: facts 49116 1727204680.92004: Evaluated conditional (ansible_distribution == 'CentOS'): False 49116 1727204680.92008: when evaluation is False, skipping this task 49116 1727204680.92011: _execute() done 49116 1727204680.92014: dumping result to json 49116 1727204680.92017: done dumping result, returning 49116 1727204680.92020: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [127b8e07-fff9-02f7-957b-0000000000c5] 49116 1727204680.92030: sending task result for task 127b8e07-fff9-02f7-957b-0000000000c5 49116 1727204680.92127: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000c5 49116 1727204680.92140: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 49116 1727204680.92227: no more pending results, returning what we have 49116 1727204680.92231: results queue empty 49116 1727204680.92233: checking for any_errors_fatal 49116 1727204680.92237: done checking for any_errors_fatal 49116 1727204680.92238: checking for max_fail_percentage 49116 1727204680.92242: done checking for max_fail_percentage 49116 1727204680.92242: checking to see if all hosts have failed and the running result is not ok 49116 1727204680.92243: done checking to see if all hosts have failed 49116 1727204680.92244: getting the remaining hosts for this loop 49116 1727204680.92245: done getting the remaining hosts for this loop 49116 1727204680.92248: getting the next task for host managed-node3 49116 1727204680.92255: done getting next task for host managed-node3 49116 1727204680.92257: ^ task is: TASK: Include the task 'enable_epel.yml' 49116 1727204680.92260: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204680.92263: getting variables 49116 1727204680.92264: in VariableManager get_vars() 49116 1727204680.92292: Calling all_inventory to load vars for managed-node3 49116 1727204680.92295: Calling groups_inventory to load vars for managed-node3 49116 1727204680.92298: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204680.92309: Calling all_plugins_play to load vars for managed-node3 49116 1727204680.92312: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204680.92314: Calling groups_plugins_play to load vars for managed-node3 49116 1727204680.92480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204680.92646: done with get_vars() 49116 1727204680.92654: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.019) 0:00:03.952 ***** 49116 1727204680.92740: entering _queue_task() for managed-node3/include_tasks 49116 1727204680.93039: worker is 1 (out of 1 available) 49116 1727204680.93053: exiting _queue_task() for managed-node3/include_tasks 49116 1727204680.93069: done queuing things up, now waiting for results queue to drain 49116 1727204680.93071: waiting for pending results... 49116 1727204680.93348: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 49116 1727204680.93408: in run() - task 127b8e07-fff9-02f7-957b-0000000000c6 49116 1727204680.93420: variable 'ansible_search_path' from source: unknown 49116 1727204680.93424: variable 'ansible_search_path' from source: unknown 49116 1727204680.93462: calling self._execute() 49116 1727204680.93522: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.93526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204680.93545: variable 'omit' from source: magic vars 49116 1727204680.94004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204680.96307: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204680.96367: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204680.96398: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204680.96426: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204680.96453: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204680.96523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204680.96549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204680.96569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 49116 1727204680.96601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204680.96612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204680.96714: variable '__network_is_ostree' from source: set_fact 49116 1727204680.96729: Evaluated conditional (not __network_is_ostree | d(false)): True 49116 1727204680.96738: _execute() done 49116 1727204680.96741: dumping result to json 49116 1727204680.96743: done dumping result, returning 49116 1727204680.96751: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-02f7-957b-0000000000c6] 49116 1727204680.96757: sending task result for task 127b8e07-fff9-02f7-957b-0000000000c6 49116 1727204680.96857: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000c6 49116 1727204680.96859: WORKER PROCESS EXITING 49116 1727204680.96894: no more pending results, returning what we have 49116 1727204680.96899: in VariableManager get_vars() 49116 1727204680.96933: Calling all_inventory to load vars for managed-node3 49116 1727204680.96936: Calling groups_inventory to load vars for managed-node3 49116 1727204680.96940: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204680.96953: Calling all_plugins_play to load vars for managed-node3 49116 1727204680.96955: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204680.96958: Calling groups_plugins_play to load vars for managed-node3 49116 1727204680.97148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204680.97303: done with get_vars() 49116 1727204680.97311: variable 'ansible_search_path' from source: unknown 49116 1727204680.97311: variable 'ansible_search_path' from source: unknown 49116 1727204680.97343: we have included files to process 49116 1727204680.97344: generating all_blocks data 49116 1727204680.97345: done generating all_blocks data 49116 1727204680.97350: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 49116 1727204680.97351: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 49116 1727204680.97352: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 49116 1727204680.97923: done processing included file 49116 1727204680.97925: iterating over new_blocks loaded from include file 49116 1727204680.97926: in VariableManager get_vars() 49116 1727204680.97938: done with get_vars() 49116 1727204680.97939: filtering new block on tags 49116 1727204680.97958: done filtering new block on tags 49116 1727204680.97961: in VariableManager get_vars() 49116 1727204680.97971: done with get_vars() 49116 1727204680.97972: filtering new block on tags 49116 1727204680.97980: done filtering new block on tags 49116 1727204680.97982: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 49116 1727204680.97994: extending task lists for all hosts with included blocks 
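The include above is gated on not __network_is_ostree | d(false), which evaluates to True here, so enable_epel.yml is loaded and its blocks are appended to the task list for managed-node3. Assuming a standard include_tasks call (the log only shows the resolved absolute path, so the relative path is a guess), el_repo_setup.yml:51 presumably reads along these lines:

    - name: Include the task 'enable_epel.yml'
      include_tasks: tasks/enable_epel.yml     # relative path is an assumption
      when: not __network_is_ostree | d(false) # condition string taken from the evaluation logged above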
49116 1727204680.98178: done extending task lists 49116 1727204680.98179: done processing included files 49116 1727204680.98180: results queue empty 49116 1727204680.98181: checking for any_errors_fatal 49116 1727204680.98187: done checking for any_errors_fatal 49116 1727204680.98188: checking for max_fail_percentage 49116 1727204680.98189: done checking for max_fail_percentage 49116 1727204680.98190: checking to see if all hosts have failed and the running result is not ok 49116 1727204680.98191: done checking to see if all hosts have failed 49116 1727204680.98192: getting the remaining hosts for this loop 49116 1727204680.98193: done getting the remaining hosts for this loop 49116 1727204680.98196: getting the next task for host managed-node3 49116 1727204680.98200: done getting next task for host managed-node3 49116 1727204680.98203: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 49116 1727204680.98206: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204680.98209: getting variables 49116 1727204680.98210: in VariableManager get_vars() 49116 1727204680.98217: Calling all_inventory to load vars for managed-node3 49116 1727204680.98219: Calling groups_inventory to load vars for managed-node3 49116 1727204680.98220: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204680.98225: Calling all_plugins_play to load vars for managed-node3 49116 1727204680.98230: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204680.98234: Calling groups_plugins_play to load vars for managed-node3 49116 1727204680.98353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204680.98593: done with get_vars() 49116 1727204680.98618: done getting variables 49116 1727204680.98774: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 49116 1727204680.99115: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:04:40 -0400 (0:00:00.064) 0:00:04.016 ***** 49116 1727204680.99155: entering _queue_task() for managed-node3/command 49116 1727204680.99156: Creating lock for command 49116 1727204680.99561: worker is 1 (out of 1 available) 49116 1727204680.99583: exiting _queue_task() for managed-node3/command 49116 1727204680.99600: done queuing things up, now waiting for results queue to drain 49116 1727204680.99601: waiting for pending results... 
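The templated task name 'Create EPEL {{ ansible_distribution_major_version }}' renders as 'Create EPEL 40' on this node, and the task uses the command action. The actual command is not captured in this excerpt; a sketch only, with a placeholder command and the distribution gate that the skip result below reports:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: echo "install the epel-release package here"   # placeholder; the real command is not visible in this log
      when: ansible_distribution in ['RedHat', 'CentOS']       # condition taken from the skip result below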
49116 1727204680.99789: running TaskExecutor() for managed-node3/TASK: Create EPEL 40 49116 1727204680.99871: in run() - task 127b8e07-fff9-02f7-957b-0000000000e0 49116 1727204680.99883: variable 'ansible_search_path' from source: unknown 49116 1727204680.99887: variable 'ansible_search_path' from source: unknown 49116 1727204680.99918: calling self._execute() 49116 1727204680.99986: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204680.99990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.00000: variable 'omit' from source: magic vars 49116 1727204681.00439: variable 'ansible_distribution' from source: facts 49116 1727204681.00456: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 49116 1727204681.00463: when evaluation is False, skipping this task 49116 1727204681.00472: _execute() done 49116 1727204681.00479: dumping result to json 49116 1727204681.00486: done dumping result, returning 49116 1727204681.00496: done running TaskExecutor() for managed-node3/TASK: Create EPEL 40 [127b8e07-fff9-02f7-957b-0000000000e0] 49116 1727204681.00505: sending task result for task 127b8e07-fff9-02f7-957b-0000000000e0 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 49116 1727204681.00831: no more pending results, returning what we have 49116 1727204681.00838: results queue empty 49116 1727204681.00839: checking for any_errors_fatal 49116 1727204681.00840: done checking for any_errors_fatal 49116 1727204681.00841: checking for max_fail_percentage 49116 1727204681.00843: done checking for max_fail_percentage 49116 1727204681.00844: checking to see if all hosts have failed and the running result is not ok 49116 1727204681.00845: done checking to see if all hosts have failed 49116 1727204681.00846: getting the remaining hosts for this loop 49116 1727204681.00848: done getting the remaining hosts for this loop 49116 1727204681.00852: getting the next task for host managed-node3 49116 1727204681.00862: done getting next task for host managed-node3 49116 1727204681.00866: ^ task is: TASK: Install yum-utils package 49116 1727204681.00871: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204681.00876: getting variables 49116 1727204681.00877: in VariableManager get_vars() 49116 1727204681.00914: Calling all_inventory to load vars for managed-node3 49116 1727204681.00917: Calling groups_inventory to load vars for managed-node3 49116 1727204681.00922: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.00941: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.00946: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.00949: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.01343: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000e0 49116 1727204681.01346: WORKER PROCESS EXITING 49116 1727204681.01376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.01645: done with get_vars() 49116 1727204681.01657: done getting variables 49116 1727204681.01773: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:04:41 -0400 (0:00:00.026) 0:00:04.043 ***** 49116 1727204681.01806: entering _queue_task() for managed-node3/package 49116 1727204681.01808: Creating lock for package 49116 1727204681.02189: worker is 1 (out of 1 available) 49116 1727204681.02205: exiting _queue_task() for managed-node3/package 49116 1727204681.02218: done queuing things up, now waiting for results queue to drain 49116 1727204681.02220: waiting for pending results... 
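This task loads the generic package action plugin and, like the previous one, is skipped on hosts that are not RedHat or CentOS. A minimal sketch, assuming the obvious package name from the task title:

    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present                                        # state is an assumption
      when: ansible_distribution in ['RedHat', 'CentOS']      # same gate reported in the skip result below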
49116 1727204681.02393: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 49116 1727204681.02475: in run() - task 127b8e07-fff9-02f7-957b-0000000000e1 49116 1727204681.02485: variable 'ansible_search_path' from source: unknown 49116 1727204681.02490: variable 'ansible_search_path' from source: unknown 49116 1727204681.02523: calling self._execute() 49116 1727204681.02592: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.02596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.02606: variable 'omit' from source: magic vars 49116 1727204681.03073: variable 'ansible_distribution' from source: facts 49116 1727204681.03078: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 49116 1727204681.03080: when evaluation is False, skipping this task 49116 1727204681.03083: _execute() done 49116 1727204681.03086: dumping result to json 49116 1727204681.03088: done dumping result, returning 49116 1727204681.03091: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [127b8e07-fff9-02f7-957b-0000000000e1] 49116 1727204681.03093: sending task result for task 127b8e07-fff9-02f7-957b-0000000000e1 49116 1727204681.03180: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000e1 49116 1727204681.03183: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 49116 1727204681.03253: no more pending results, returning what we have 49116 1727204681.03258: results queue empty 49116 1727204681.03259: checking for any_errors_fatal 49116 1727204681.03269: done checking for any_errors_fatal 49116 1727204681.03269: checking for max_fail_percentage 49116 1727204681.03272: done checking for max_fail_percentage 49116 1727204681.03273: checking to see if all hosts have failed and the running result is not ok 49116 1727204681.03274: done checking to see if all hosts have failed 49116 1727204681.03274: getting the remaining hosts for this loop 49116 1727204681.03276: done getting the remaining hosts for this loop 49116 1727204681.03281: getting the next task for host managed-node3 49116 1727204681.03289: done getting next task for host managed-node3 49116 1727204681.03291: ^ task is: TASK: Enable EPEL 7 49116 1727204681.03308: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204681.03313: getting variables 49116 1727204681.03315: in VariableManager get_vars() 49116 1727204681.03354: Calling all_inventory to load vars for managed-node3 49116 1727204681.03357: Calling groups_inventory to load vars for managed-node3 49116 1727204681.03362: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.03537: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.03542: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.03547: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.03936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.04209: done with get_vars() 49116 1727204681.04232: done getting variables 49116 1727204681.04298: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:04:41 -0400 (0:00:00.025) 0:00:04.068 ***** 49116 1727204681.04350: entering _queue_task() for managed-node3/command 49116 1727204681.05227: worker is 1 (out of 1 available) 49116 1727204681.05239: exiting _queue_task() for managed-node3/command 49116 1727204681.05252: done queuing things up, now waiting for results queue to drain 49116 1727204681.05254: waiting for pending results... 49116 1727204681.05647: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 49116 1727204681.05652: in run() - task 127b8e07-fff9-02f7-957b-0000000000e2 49116 1727204681.05656: variable 'ansible_search_path' from source: unknown 49116 1727204681.05658: variable 'ansible_search_path' from source: unknown 49116 1727204681.05680: calling self._execute() 49116 1727204681.05786: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.05850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.05854: variable 'omit' from source: magic vars 49116 1727204681.06292: variable 'ansible_distribution' from source: facts 49116 1727204681.06311: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 49116 1727204681.06318: when evaluation is False, skipping this task 49116 1727204681.06326: _execute() done 49116 1727204681.06332: dumping result to json 49116 1727204681.06345: done dumping result, returning 49116 1727204681.06371: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [127b8e07-fff9-02f7-957b-0000000000e2] 49116 1727204681.06375: sending task result for task 127b8e07-fff9-02f7-957b-0000000000e2 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 49116 1727204681.06701: no more pending results, returning what we have 49116 1727204681.06705: results queue empty 49116 1727204681.06706: checking for any_errors_fatal 49116 1727204681.06723: done checking for any_errors_fatal 49116 1727204681.06725: checking for max_fail_percentage 49116 1727204681.06728: done checking for max_fail_percentage 49116 1727204681.06729: checking to see if all hosts have 
failed and the running result is not ok 49116 1727204681.06730: done checking to see if all hosts have failed 49116 1727204681.06731: getting the remaining hosts for this loop 49116 1727204681.06732: done getting the remaining hosts for this loop 49116 1727204681.06737: getting the next task for host managed-node3 49116 1727204681.06746: done getting next task for host managed-node3 49116 1727204681.06750: ^ task is: TASK: Enable EPEL 8 49116 1727204681.06755: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204681.06759: getting variables 49116 1727204681.06761: in VariableManager get_vars() 49116 1727204681.06858: Calling all_inventory to load vars for managed-node3 49116 1727204681.06861: Calling groups_inventory to load vars for managed-node3 49116 1727204681.06939: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.06954: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.06958: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.06962: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.06981: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000e2 49116 1727204681.06984: WORKER PROCESS EXITING 49116 1727204681.07484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.07930: done with get_vars() 49116 1727204681.07945: done getting variables 49116 1727204681.08193: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:04:41 -0400 (0:00:00.038) 0:00:04.107 ***** 49116 1727204681.08229: entering _queue_task() for managed-node3/command 49116 1727204681.09064: worker is 1 (out of 1 available) 49116 1727204681.09137: exiting _queue_task() for managed-node3/command 49116 1727204681.09151: done queuing things up, now waiting for results queue to drain 49116 1727204681.09153: waiting for pending results... 
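The three 'Enable EPEL 7/8/6' tasks follow the same pattern: EPEL 7 and 8 load the command action, EPEL 6 loads copy, and all are skipped here under the same distribution gate. A representative sketch for one of the command-based variants; the command itself is a placeholder and the extra version condition is an assumption not shown in the log:

    - name: Enable EPEL 8
      command: echo "enable the EPEL repository here"          # placeholder command
      when:
        - ansible_distribution in ['RedHat', 'CentOS']         # gate taken from the skip results
        - ansible_distribution_major_version | int == 8        # assumed version check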
49116 1727204681.09559: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 49116 1727204681.09749: in run() - task 127b8e07-fff9-02f7-957b-0000000000e3 49116 1727204681.10271: variable 'ansible_search_path' from source: unknown 49116 1727204681.10281: variable 'ansible_search_path' from source: unknown 49116 1727204681.10353: calling self._execute() 49116 1727204681.10463: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.10480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.10496: variable 'omit' from source: magic vars 49116 1727204681.10961: variable 'ansible_distribution' from source: facts 49116 1727204681.10989: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 49116 1727204681.10998: when evaluation is False, skipping this task 49116 1727204681.11007: _execute() done 49116 1727204681.11061: dumping result to json 49116 1727204681.11064: done dumping result, returning 49116 1727204681.11069: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [127b8e07-fff9-02f7-957b-0000000000e3] 49116 1727204681.11072: sending task result for task 127b8e07-fff9-02f7-957b-0000000000e3 49116 1727204681.11378: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000e3 49116 1727204681.11381: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 49116 1727204681.11427: no more pending results, returning what we have 49116 1727204681.11431: results queue empty 49116 1727204681.11432: checking for any_errors_fatal 49116 1727204681.11439: done checking for any_errors_fatal 49116 1727204681.11440: checking for max_fail_percentage 49116 1727204681.11442: done checking for max_fail_percentage 49116 1727204681.11443: checking to see if all hosts have failed and the running result is not ok 49116 1727204681.11444: done checking to see if all hosts have failed 49116 1727204681.11445: getting the remaining hosts for this loop 49116 1727204681.11446: done getting the remaining hosts for this loop 49116 1727204681.11450: getting the next task for host managed-node3 49116 1727204681.11460: done getting next task for host managed-node3 49116 1727204681.11462: ^ task is: TASK: Enable EPEL 6 49116 1727204681.11469: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204681.11473: getting variables 49116 1727204681.11474: in VariableManager get_vars() 49116 1727204681.11507: Calling all_inventory to load vars for managed-node3 49116 1727204681.11510: Calling groups_inventory to load vars for managed-node3 49116 1727204681.11514: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.11527: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.11530: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.11533: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.11951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.12454: done with get_vars() 49116 1727204681.12473: done getting variables 49116 1727204681.12545: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:04:41 -0400 (0:00:00.044) 0:00:04.151 ***** 49116 1727204681.12702: entering _queue_task() for managed-node3/copy 49116 1727204681.13718: worker is 1 (out of 1 available) 49116 1727204681.13729: exiting _queue_task() for managed-node3/copy 49116 1727204681.13749: done queuing things up, now waiting for results queue to drain 49116 1727204681.13750: waiting for pending results... 49116 1727204681.14315: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 49116 1727204681.14632: in run() - task 127b8e07-fff9-02f7-957b-0000000000e5 49116 1727204681.14726: variable 'ansible_search_path' from source: unknown 49116 1727204681.14730: variable 'ansible_search_path' from source: unknown 49116 1727204681.14738: calling self._execute() 49116 1727204681.15178: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.15184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.15187: variable 'omit' from source: magic vars 49116 1727204681.15915: variable 'ansible_distribution' from source: facts 49116 1727204681.16092: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 49116 1727204681.16101: when evaluation is False, skipping this task 49116 1727204681.16110: _execute() done 49116 1727204681.16117: dumping result to json 49116 1727204681.16124: done dumping result, returning 49116 1727204681.16141: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [127b8e07-fff9-02f7-957b-0000000000e5] 49116 1727204681.16152: sending task result for task 127b8e07-fff9-02f7-957b-0000000000e5 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 49116 1727204681.16332: no more pending results, returning what we have 49116 1727204681.16336: results queue empty 49116 1727204681.16336: checking for any_errors_fatal 49116 1727204681.16341: done checking for any_errors_fatal 49116 1727204681.16342: checking for max_fail_percentage 49116 1727204681.16344: done checking for max_fail_percentage 49116 1727204681.16345: checking to see if all hosts have failed and 
the running result is not ok 49116 1727204681.16345: done checking to see if all hosts have failed 49116 1727204681.16346: getting the remaining hosts for this loop 49116 1727204681.16347: done getting the remaining hosts for this loop 49116 1727204681.16352: getting the next task for host managed-node3 49116 1727204681.16361: done getting next task for host managed-node3 49116 1727204681.16364: ^ task is: TASK: Set network provider to 'nm' 49116 1727204681.16490: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204681.16496: getting variables 49116 1727204681.16498: in VariableManager get_vars() 49116 1727204681.16527: Calling all_inventory to load vars for managed-node3 49116 1727204681.16530: Calling groups_inventory to load vars for managed-node3 49116 1727204681.16533: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.16546: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.16549: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.16552: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.16880: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000e5 49116 1727204681.16884: WORKER PROCESS EXITING 49116 1727204681.16908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.17142: done with get_vars() 49116 1727204681.17155: done getting variables 49116 1727204681.17221: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:13 Tuesday 24 September 2024 15:04:41 -0400 (0:00:00.045) 0:00:04.197 ***** 49116 1727204681.17255: entering _queue_task() for managed-node3/set_fact 49116 1727204681.17604: worker is 1 (out of 1 available) 49116 1727204681.17617: exiting _queue_task() for managed-node3/set_fact 49116 1727204681.17636: done queuing things up, now waiting for results queue to drain 49116 1727204681.17637: waiting for pending results... 
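The skip recorded above is driven entirely by the task's when: clause; the "false_condition" field in the result is the literal expression that evaluated to False on this Fedora host. A minimal sketch of a distribution-gated copy task of that shape follows; the destination path, content variable, and mode are assumptions for illustration and are not quoted from the real enable_epel.yml.

# Sketch only: a copy task gated on the same conditional the log reports.
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo          # assumed destination
    content: "{{ epel_repo_definition }}"     # assumed variable holding the repo stanza
    mode: "0644"
  when: ansible_distribution in ['RedHat', 'CentOS']

Because the conditional is false on Fedora 40, the action plugin is never invoked and the worker returns the skipped result shown above.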
49116 1727204681.17915: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 49116 1727204681.18037: in run() - task 127b8e07-fff9-02f7-957b-000000000007 49116 1727204681.18093: variable 'ansible_search_path' from source: unknown 49116 1727204681.18115: calling self._execute() 49116 1727204681.18217: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.18229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.18309: variable 'omit' from source: magic vars 49116 1727204681.18380: variable 'omit' from source: magic vars 49116 1727204681.18427: variable 'omit' from source: magic vars 49116 1727204681.18477: variable 'omit' from source: magic vars 49116 1727204681.18540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204681.18594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204681.18620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204681.18652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204681.18682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204681.18927: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204681.18931: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.18936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.18983: Set connection var ansible_connection to ssh 49116 1727204681.19016: Set connection var ansible_timeout to 10 49116 1727204681.19028: Set connection var ansible_shell_executable to /bin/sh 49116 1727204681.19041: Set connection var ansible_pipelining to False 49116 1727204681.19048: Set connection var ansible_shell_type to sh 49116 1727204681.19057: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204681.19091: variable 'ansible_shell_executable' from source: unknown 49116 1727204681.19099: variable 'ansible_connection' from source: unknown 49116 1727204681.19107: variable 'ansible_module_compression' from source: unknown 49116 1727204681.19113: variable 'ansible_shell_type' from source: unknown 49116 1727204681.19171: variable 'ansible_shell_executable' from source: unknown 49116 1727204681.19174: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.19179: variable 'ansible_pipelining' from source: unknown 49116 1727204681.19184: variable 'ansible_timeout' from source: unknown 49116 1727204681.19186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.19319: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204681.19342: variable 'omit' from source: magic vars 49116 1727204681.19354: starting attempt loop 49116 1727204681.19360: running the handler 49116 1727204681.19380: handler run complete 49116 1727204681.19401: attempt loop complete, returning result 49116 1727204681.19408: _execute() done 49116 1727204681.19415: 
dumping result to json 49116 1727204681.19484: done dumping result, returning 49116 1727204681.19487: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [127b8e07-fff9-02f7-957b-000000000007] 49116 1727204681.19490: sending task result for task 127b8e07-fff9-02f7-957b-000000000007 49116 1727204681.19673: done sending task result for task 127b8e07-fff9-02f7-957b-000000000007 49116 1727204681.19676: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 49116 1727204681.19742: no more pending results, returning what we have 49116 1727204681.19746: results queue empty 49116 1727204681.19747: checking for any_errors_fatal 49116 1727204681.19754: done checking for any_errors_fatal 49116 1727204681.19755: checking for max_fail_percentage 49116 1727204681.19757: done checking for max_fail_percentage 49116 1727204681.19758: checking to see if all hosts have failed and the running result is not ok 49116 1727204681.19759: done checking to see if all hosts have failed 49116 1727204681.19760: getting the remaining hosts for this loop 49116 1727204681.19761: done getting the remaining hosts for this loop 49116 1727204681.19768: getting the next task for host managed-node3 49116 1727204681.19776: done getting next task for host managed-node3 49116 1727204681.19779: ^ task is: TASK: meta (flush_handlers) 49116 1727204681.19781: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204681.19786: getting variables 49116 1727204681.19788: in VariableManager get_vars() 49116 1727204681.19824: Calling all_inventory to load vars for managed-node3 49116 1727204681.19827: Calling groups_inventory to load vars for managed-node3 49116 1727204681.19831: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.19847: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.19850: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.19853: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.20228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.20478: done with get_vars() 49116 1727204681.20492: done getting variables 49116 1727204681.20570: in VariableManager get_vars() 49116 1727204681.20581: Calling all_inventory to load vars for managed-node3 49116 1727204681.20583: Calling groups_inventory to load vars for managed-node3 49116 1727204681.20585: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.20591: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.20593: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.20595: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.20980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.21180: done with get_vars() 49116 1727204681.21197: done queuing things up, now waiting for results queue to drain 49116 1727204681.21199: results queue empty 49116 1727204681.21200: checking for any_errors_fatal 49116 1727204681.21203: done checking for any_errors_fatal 49116 1727204681.21203: checking for 
max_fail_percentage 49116 1727204681.21204: done checking for max_fail_percentage 49116 1727204681.21205: checking to see if all hosts have failed and the running result is not ok 49116 1727204681.21206: done checking to see if all hosts have failed 49116 1727204681.21207: getting the remaining hosts for this loop 49116 1727204681.21208: done getting the remaining hosts for this loop 49116 1727204681.21210: getting the next task for host managed-node3 49116 1727204681.21215: done getting next task for host managed-node3 49116 1727204681.21217: ^ task is: TASK: meta (flush_handlers) 49116 1727204681.21218: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204681.21227: getting variables 49116 1727204681.21228: in VariableManager get_vars() 49116 1727204681.21240: Calling all_inventory to load vars for managed-node3 49116 1727204681.21243: Calling groups_inventory to load vars for managed-node3 49116 1727204681.21245: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.21252: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.21254: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.21257: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.21406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.21635: done with get_vars() 49116 1727204681.21645: done getting variables 49116 1727204681.21703: in VariableManager get_vars() 49116 1727204681.21714: Calling all_inventory to load vars for managed-node3 49116 1727204681.21716: Calling groups_inventory to load vars for managed-node3 49116 1727204681.21719: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.21725: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.21727: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.21730: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.21941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.22182: done with get_vars() 49116 1727204681.22196: done queuing things up, now waiting for results queue to drain 49116 1727204681.22198: results queue empty 49116 1727204681.22199: checking for any_errors_fatal 49116 1727204681.22200: done checking for any_errors_fatal 49116 1727204681.22201: checking for max_fail_percentage 49116 1727204681.22202: done checking for max_fail_percentage 49116 1727204681.22203: checking to see if all hosts have failed and the running result is not ok 49116 1727204681.22204: done checking to see if all hosts have failed 49116 1727204681.22205: getting the remaining hosts for this loop 49116 1727204681.22206: done getting the remaining hosts for this loop 49116 1727204681.22209: getting the next task for host managed-node3 49116 1727204681.22212: done getting next task for host managed-node3 49116 1727204681.22213: ^ task is: None 49116 1727204681.22215: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 49116 1727204681.22216: done queuing things up, now waiting for results queue to drain 49116 1727204681.22217: results queue empty 49116 1727204681.22218: checking for any_errors_fatal 49116 1727204681.22218: done checking for any_errors_fatal 49116 1727204681.22219: checking for max_fail_percentage 49116 1727204681.22220: done checking for max_fail_percentage 49116 1727204681.22221: checking to see if all hosts have failed and the running result is not ok 49116 1727204681.22221: done checking to see if all hosts have failed 49116 1727204681.22223: getting the next task for host managed-node3 49116 1727204681.22225: done getting next task for host managed-node3 49116 1727204681.22226: ^ task is: None 49116 1727204681.22228: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204681.22287: in VariableManager get_vars() 49116 1727204681.22316: done with get_vars() 49116 1727204681.22323: in VariableManager get_vars() 49116 1727204681.22342: done with get_vars() 49116 1727204681.22347: variable 'omit' from source: magic vars 49116 1727204681.22383: in VariableManager get_vars() 49116 1727204681.22401: done with get_vars() 49116 1727204681.22444: variable 'omit' from source: magic vars PLAY [Play for testing vlan mtu setting] *************************************** 49116 1727204681.22890: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 49116 1727204681.22922: getting the remaining hosts for this loop 49116 1727204681.22924: done getting the remaining hosts for this loop 49116 1727204681.22927: getting the next task for host managed-node3 49116 1727204681.22930: done getting next task for host managed-node3 49116 1727204681.22935: ^ task is: TASK: Gathering Facts 49116 1727204681.22937: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204681.22939: getting variables 49116 1727204681.22940: in VariableManager get_vars() 49116 1727204681.22956: Calling all_inventory to load vars for managed-node3 49116 1727204681.22958: Calling groups_inventory to load vars for managed-node3 49116 1727204681.22960: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204681.22969: Calling all_plugins_play to load vars for managed-node3 49116 1727204681.22985: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204681.22989: Calling groups_plugins_play to load vars for managed-node3 49116 1727204681.23180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204681.23437: done with get_vars() 49116 1727204681.23447: done getting variables 49116 1727204681.23497: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3 Tuesday 24 September 2024 15:04:41 -0400 (0:00:00.062) 0:00:04.260 ***** 49116 1727204681.23527: entering _queue_task() for managed-node3/gather_facts 49116 1727204681.23872: worker is 1 (out of 1 available) 49116 1727204681.23886: exiting _queue_task() for managed-node3/gather_facts 49116 1727204681.23899: done queuing things up, now waiting for results queue to drain 49116 1727204681.23900: waiting for pending results... 
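Before this play starts, the wrapper playbook has already pinned network_provider to "nm" (the ok: result above), and the shared test play "Play for testing vlan mtu setting" now begins with fact gathering from playbooks/tests_vlan_mtu.yml. A minimal sketch of that two-stage layout is shown below; the play names, host pattern, and import mechanism are assumptions used for illustration, and only the fact name/value and the imported file path come from the log.

# Sketch only: provider wrapper followed by the shared test playbook.
- name: Pin the network provider before running the shared test
  hosts: all
  gather_facts: false
  tasks:
    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm        # matches the ok: result logged above

- name: Run the shared vlan mtu test
  import_playbook: playbooks/tests_vlan_mtu.yml   # assumed to gather facts and exercise the role

This pattern lets the same test body run under either provider by swapping only the wrapper's set_fact value.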
49116 1727204681.24495: running TaskExecutor() for managed-node3/TASK: Gathering Facts 49116 1727204681.24619: in run() - task 127b8e07-fff9-02f7-957b-00000000010b 49116 1727204681.24813: variable 'ansible_search_path' from source: unknown 49116 1727204681.24818: calling self._execute() 49116 1727204681.24982: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.25040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.25056: variable 'omit' from source: magic vars 49116 1727204681.25728: variable 'ansible_distribution_major_version' from source: facts 49116 1727204681.25753: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204681.25764: variable 'omit' from source: magic vars 49116 1727204681.25806: variable 'omit' from source: magic vars 49116 1727204681.25853: variable 'omit' from source: magic vars 49116 1727204681.25907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204681.26009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204681.26012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204681.26123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204681.26143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204681.26182: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204681.26190: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.26197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.26308: Set connection var ansible_connection to ssh 49116 1727204681.26329: Set connection var ansible_timeout to 10 49116 1727204681.26350: Set connection var ansible_shell_executable to /bin/sh 49116 1727204681.26360: Set connection var ansible_pipelining to False 49116 1727204681.26450: Set connection var ansible_shell_type to sh 49116 1727204681.26453: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204681.26455: variable 'ansible_shell_executable' from source: unknown 49116 1727204681.26457: variable 'ansible_connection' from source: unknown 49116 1727204681.26459: variable 'ansible_module_compression' from source: unknown 49116 1727204681.26461: variable 'ansible_shell_type' from source: unknown 49116 1727204681.26463: variable 'ansible_shell_executable' from source: unknown 49116 1727204681.26467: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204681.26469: variable 'ansible_pipelining' from source: unknown 49116 1727204681.26472: variable 'ansible_timeout' from source: unknown 49116 1727204681.26473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204681.26647: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204681.26672: variable 'omit' from source: magic vars 49116 1727204681.26683: starting attempt loop 49116 1727204681.26690: running the 
handler 49116 1727204681.26709: variable 'ansible_facts' from source: unknown 49116 1727204681.26731: _low_level_execute_command(): starting 49116 1727204681.26746: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204681.27525: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204681.27550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204681.27671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204681.27693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204681.27808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204681.30402: stdout chunk (state=3): >>>/root <<< 49116 1727204681.30662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204681.30902: stdout chunk (state=3): >>><<< 49116 1727204681.30907: stderr chunk (state=3): >>><<< 49116 1727204681.30910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49116 1727204681.30913: _low_level_execute_command(): starting 49116 1727204681.30916: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136 `" && echo ansible-tmp-1727204681.307953-49404-173252167224136="` echo 
/root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136 `" ) && sleep 0' 49116 1727204681.32012: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204681.32034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204681.32057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204681.32169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204681.32199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204681.32228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204681.32250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204681.32461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204681.35442: stdout chunk (state=3): >>>ansible-tmp-1727204681.307953-49404-173252167224136=/root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136 <<< 49116 1727204681.35696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204681.35706: stdout chunk (state=3): >>><<< 49116 1727204681.35722: stderr chunk (state=3): >>><<< 49116 1727204681.35752: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204681.307953-49404-173252167224136=/root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49116 1727204681.35796: variable 'ansible_module_compression' from source: unknown 49116 1727204681.35934: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 49116 1727204681.35957: variable 'ansible_facts' from source: unknown 49116 1727204681.36185: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/AnsiballZ_setup.py 49116 1727204681.36417: Sending initial data 49116 1727204681.36420: Sent initial data (153 bytes) 49116 1727204681.37491: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204681.37719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204681.37834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204681.40350: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204681.40459: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204681.40571: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp4cp2bfsh /root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/AnsiballZ_setup.py <<< 49116 1727204681.40584: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/AnsiballZ_setup.py" <<< 49116 1727204681.40676: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp4cp2bfsh" to remote "/root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/AnsiballZ_setup.py" <<< 49116 1727204681.43842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204681.43846: stdout chunk (state=3): >>><<< 49116 1727204681.43849: stderr chunk (state=3): >>><<< 49116 1727204681.43851: done transferring module to remote 49116 1727204681.43854: _low_level_execute_command(): starting 49116 1727204681.43856: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/ /root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/AnsiballZ_setup.py && sleep 0' 49116 1727204681.44617: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204681.44635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204681.44678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204681.44697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204681.44776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204681.47574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204681.47647: stderr chunk (state=3): >>><<< 49116 1727204681.47652: stdout chunk (state=3): >>><<< 49116 1727204681.47758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49116 1727204681.47770: _low_level_execute_command(): starting 49116 1727204681.47773: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/AnsiballZ_setup.py && sleep 0' 49116 1727204681.48391: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204681.48472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204681.48486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204681.48555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204681.48560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204681.48634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204682.40722: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3005, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 711, "free": 3005}, "nocache": {"free": 3458, "used": 258}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansibl<<< 49116 1727204682.40730: stdout chunk (state=3): >>>e_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1019, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251300900864, "block_size": 4096, "block_total": 64479564, "block_available": 61352759, "block_used": 3126805, "inode_total": 16384000, "inode_available": 16301236, "inode_used": 82764, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.5380859375, "5m": 0.60302734375, "15m": 0.4326171875}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "42", "epoch": "1727204682", "epoch_int": "1727204682", "date": "2024-09-24", "time": "15:04:42", "iso8601_micro": "2024-09-24T19:04:42.350829Z", "iso8601": "2024-09-24T19:04:42Z", "iso8601_basic": "20240924T150442350829", "iso8601_basic_short": "20240924T150442", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": 
["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 49116 1727204682.42693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204682.42852: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204682.42856: stderr chunk (state=3): >>><<< 49116 1727204682.43171: stdout chunk (state=3): >>><<< 49116 1727204682.43178: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 39754 10.31.45.169 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 39754 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2f45753f4562ec5cb5af974e0ba4b7", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDMno02CCUpdfdqhSLw3DEMgRB2qjjltGvjvvCA4FaqYC6LCLimL/modyFwZBTr6mod5aVbEv6eVT3cFEFELkt0kQvtOziptiFW5YZ0dlqvF004nHv7tpUqUboaKXf3hY9kfDIHOuUKZOV1AH7UTuNGixxTuYXFQ+fG7hLGh4Vep864Qk6wN5hv56JDtXEzMMB7xxbnEU6nTFIA8TIX+aYYYxIipVJjI+TR9J9VhQf+oNDJhhqqHyobnqG5WTt3jEYQo+8cWC4B8LegOCwae4jpCrLwhKnzmvV787NTqy90vgHgain4GhWTCKI+2dFsqryKBgKIBuENXOpmpRGT4gqBQHbc5v/vxWqYoPPhg1Wb8R+WRueYbdol4I10CveFNShlWCSRLSu/vOutS0xtU3WEIQFs2Mn06Aqq6bMoG70EJ9bJOEQ82f23VIkVKoW1hmcKHTCrMv715oNONo08LOTQkBYDv3MQpAtFQnpuIVPlAXYu1spAx3i2i31866ukCUE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG5WDcSyqLv17rg+6P3+2pjKf2x2X+Jf8yHGACagVeIm/l8LWG2NszXTHOdaZlbD4aes7hBRe0B7oCa8ilqHGf0=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII5o3yNikV31ncy7je2EsNwog36vbYT7D9w98r4ZeD7x", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 
33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3005, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 711, "free": 3005}, "nocache": {"free": 3458, "used": 258}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_uuid": "ec2f4575-3f45-62ec-5cb5-af974e0ba4b7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1019, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251300900864, "block_size": 4096, "block_total": 64479564, "block_available": 61352759, "block_used": 3126805, "inode_total": 16384000, "inode_available": 16301236, "inode_used": 82764, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": 
"auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.5380859375, "5m": 0.60302734375, "15m": 0.4326171875}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "04", "second": "42", "epoch": "1727204682", "epoch_int": "1727204682", "date": "2024-09-24", "time": "15:04:42", "iso8601_micro": "2024-09-24T19:04:42.350829Z", "iso8601": "2024-09-24T19:04:42Z", "iso8601_basic": "20240924T150442350829", "iso8601_basic_short": "20240924T150442", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::aa:78ff:fea8:9b13", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.169", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:aa:78:a8:9b:13", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.169"], "ansible_all_ipv6_addresses": ["fe80::aa:78ff:fea8:9b13"], "ansible_locally_reachable_ips": 
{"ipv4": ["10.31.45.169", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::aa:78ff:fea8:9b13"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 49116 1727204682.43911: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204682.43916: _low_level_execute_command(): starting 49116 1727204682.43919: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204681.307953-49404-173252167224136/ > /dev/null 2>&1 && sleep 0' 49116 1727204682.45089: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204682.45294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204682.45511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204682.45590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49116 1727204682.47986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204682.48000: stdout chunk (state=3): >>><<< 49116 1727204682.48012: stderr chunk (state=3): >>><<< 49116 1727204682.48043: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49116 1727204682.48069: handler run complete 49116 1727204682.48478: variable 'ansible_facts' from source: unknown 49116 1727204682.48598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204682.49362: variable 'ansible_facts' from source: unknown 49116 1727204682.49674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204682.50035: attempt loop complete, returning result 49116 1727204682.50047: _execute() done 49116 1727204682.50054: dumping result to json 49116 1727204682.50371: done dumping result, returning 49116 1727204682.50375: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [127b8e07-fff9-02f7-957b-00000000010b] 49116 1727204682.50377: sending task result for task 127b8e07-fff9-02f7-957b-00000000010b 49116 1727204682.51449: done sending task result for task 127b8e07-fff9-02f7-957b-00000000010b 49116 1727204682.51452: WORKER PROCESS EXITING ok: [managed-node3] 49116 1727204682.52237: no more pending results, returning what we have 49116 1727204682.52240: results queue empty 49116 1727204682.52241: checking for any_errors_fatal 49116 1727204682.52243: done checking for any_errors_fatal 49116 1727204682.52243: checking for max_fail_percentage 49116 1727204682.52245: done checking for max_fail_percentage 49116 1727204682.52246: checking to see if all hosts have failed and the running result is not ok 49116 1727204682.52246: done checking to see if all hosts have failed 49116 1727204682.52247: getting the remaining hosts for this loop 49116 1727204682.52248: done getting the remaining hosts for this loop 49116 1727204682.52253: 
getting the next task for host managed-node3 49116 1727204682.52259: done getting next task for host managed-node3 49116 1727204682.52261: ^ task is: TASK: meta (flush_handlers) 49116 1727204682.52262: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204682.52287: getting variables 49116 1727204682.52289: in VariableManager get_vars() 49116 1727204682.52322: Calling all_inventory to load vars for managed-node3 49116 1727204682.52325: Calling groups_inventory to load vars for managed-node3 49116 1727204682.52328: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204682.52343: Calling all_plugins_play to load vars for managed-node3 49116 1727204682.52346: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204682.52349: Calling groups_plugins_play to load vars for managed-node3 49116 1727204682.52657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204682.53273: done with get_vars() 49116 1727204682.53288: done getting variables 49116 1727204682.53481: in VariableManager get_vars() 49116 1727204682.53500: Calling all_inventory to load vars for managed-node3 49116 1727204682.53503: Calling groups_inventory to load vars for managed-node3 49116 1727204682.53505: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204682.53516: Calling all_plugins_play to load vars for managed-node3 49116 1727204682.53519: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204682.53522: Calling groups_plugins_play to load vars for managed-node3 49116 1727204682.54040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204682.54735: done with get_vars() 49116 1727204682.54760: done queuing things up, now waiting for results queue to drain 49116 1727204682.54763: results queue empty 49116 1727204682.54764: checking for any_errors_fatal 49116 1727204682.54771: done checking for any_errors_fatal 49116 1727204682.54772: checking for max_fail_percentage 49116 1727204682.54773: done checking for max_fail_percentage 49116 1727204682.54774: checking to see if all hosts have failed and the running result is not ok 49116 1727204682.54774: done checking to see if all hosts have failed 49116 1727204682.54780: getting the remaining hosts for this loop 49116 1727204682.54781: done getting the remaining hosts for this loop 49116 1727204682.54785: getting the next task for host managed-node3 49116 1727204682.54789: done getting next task for host managed-node3 49116 1727204682.54792: ^ task is: TASK: Include the task 'show_interfaces.yml' 49116 1727204682.54793: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204682.54796: getting variables 49116 1727204682.54797: in VariableManager get_vars() 49116 1727204682.54815: Calling all_inventory to load vars for managed-node3 49116 1727204682.54818: Calling groups_inventory to load vars for managed-node3 49116 1727204682.54820: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204682.54826: Calling all_plugins_play to load vars for managed-node3 49116 1727204682.54828: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204682.54949: Calling groups_plugins_play to load vars for managed-node3 49116 1727204682.55259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204682.55806: done with get_vars() 49116 1727204682.55939: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:10 Tuesday 24 September 2024 15:04:42 -0400 (0:00:01.325) 0:00:05.585 ***** 49116 1727204682.56136: entering _queue_task() for managed-node3/include_tasks 49116 1727204682.57322: worker is 1 (out of 1 available) 49116 1727204682.57338: exiting _queue_task() for managed-node3/include_tasks 49116 1727204682.57350: done queuing things up, now waiting for results queue to drain 49116 1727204682.57352: waiting for pending results... 49116 1727204682.58212: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 49116 1727204682.58405: in run() - task 127b8e07-fff9-02f7-957b-00000000000b 49116 1727204682.58419: variable 'ansible_search_path' from source: unknown 49116 1727204682.58461: calling self._execute() 49116 1727204682.58554: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204682.58558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204682.59079: variable 'omit' from source: magic vars 49116 1727204682.60006: variable 'ansible_distribution_major_version' from source: facts 49116 1727204682.60011: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204682.60015: _execute() done 49116 1727204682.60018: dumping result to json 49116 1727204682.60020: done dumping result, returning 49116 1727204682.60023: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-02f7-957b-00000000000b] 49116 1727204682.60025: sending task result for task 127b8e07-fff9-02f7-957b-00000000000b 49116 1727204682.60261: no more pending results, returning what we have 49116 1727204682.60270: in VariableManager get_vars() 49116 1727204682.60320: Calling all_inventory to load vars for managed-node3 49116 1727204682.60471: Calling groups_inventory to load vars for managed-node3 49116 1727204682.60475: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204682.60492: Calling all_plugins_play to load vars for managed-node3 49116 1727204682.60496: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204682.60499: Calling groups_plugins_play to load vars for managed-node3 49116 1727204682.60903: done sending task result for task 127b8e07-fff9-02f7-957b-00000000000b 49116 1727204682.60907: WORKER PROCESS EXITING 49116 1727204682.60935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204682.61488: done with get_vars() 49116 1727204682.61511: 
variable 'ansible_search_path' from source: unknown 49116 1727204682.61528: we have included files to process 49116 1727204682.61530: generating all_blocks data 49116 1727204682.61531: done generating all_blocks data 49116 1727204682.61534: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204682.61535: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204682.61538: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204682.61917: in VariableManager get_vars() 49116 1727204682.62060: done with get_vars() 49116 1727204682.62313: done processing included file 49116 1727204682.62315: iterating over new_blocks loaded from include file 49116 1727204682.62317: in VariableManager get_vars() 49116 1727204682.62339: done with get_vars() 49116 1727204682.62341: filtering new block on tags 49116 1727204682.62360: done filtering new block on tags 49116 1727204682.62481: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 49116 1727204682.62488: extending task lists for all hosts with included blocks 49116 1727204682.68339: done extending task lists 49116 1727204682.68341: done processing included files 49116 1727204682.68342: results queue empty 49116 1727204682.68343: checking for any_errors_fatal 49116 1727204682.68345: done checking for any_errors_fatal 49116 1727204682.68346: checking for max_fail_percentage 49116 1727204682.68348: done checking for max_fail_percentage 49116 1727204682.68348: checking to see if all hosts have failed and the running result is not ok 49116 1727204682.68349: done checking to see if all hosts have failed 49116 1727204682.68350: getting the remaining hosts for this loop 49116 1727204682.68351: done getting the remaining hosts for this loop 49116 1727204682.68354: getting the next task for host managed-node3 49116 1727204682.68359: done getting next task for host managed-node3 49116 1727204682.68361: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 49116 1727204682.68364: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204682.68468: getting variables 49116 1727204682.68471: in VariableManager get_vars() 49116 1727204682.68492: Calling all_inventory to load vars for managed-node3 49116 1727204682.68495: Calling groups_inventory to load vars for managed-node3 49116 1727204682.68497: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204682.68506: Calling all_plugins_play to load vars for managed-node3 49116 1727204682.68509: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204682.68511: Calling groups_plugins_play to load vars for managed-node3 49116 1727204682.68928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204682.69419: done with get_vars() 49116 1727204682.69437: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:04:42 -0400 (0:00:00.137) 0:00:05.722 ***** 49116 1727204682.69757: entering _queue_task() for managed-node3/include_tasks 49116 1727204682.70740: worker is 1 (out of 1 available) 49116 1727204682.70756: exiting _queue_task() for managed-node3/include_tasks 49116 1727204682.70884: done queuing things up, now waiting for results queue to drain 49116 1727204682.70887: waiting for pending results... 49116 1727204682.71368: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 49116 1727204682.71642: in run() - task 127b8e07-fff9-02f7-957b-000000000120 49116 1727204682.71793: variable 'ansible_search_path' from source: unknown 49116 1727204682.71797: variable 'ansible_search_path' from source: unknown 49116 1727204682.71800: calling self._execute() 49116 1727204682.72010: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204682.72014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204682.72060: variable 'omit' from source: magic vars 49116 1727204682.73254: variable 'ansible_distribution_major_version' from source: facts 49116 1727204682.73259: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204682.73262: _execute() done 49116 1727204682.73264: dumping result to json 49116 1727204682.73268: done dumping result, returning 49116 1727204682.73271: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-02f7-957b-000000000120] 49116 1727204682.73274: sending task result for task 127b8e07-fff9-02f7-957b-000000000120 49116 1727204682.73419: no more pending results, returning what we have 49116 1727204682.73427: in VariableManager get_vars() 49116 1727204682.73489: Calling all_inventory to load vars for managed-node3 49116 1727204682.73493: Calling groups_inventory to load vars for managed-node3 49116 1727204682.73495: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204682.73514: Calling all_plugins_play to load vars for managed-node3 49116 1727204682.73518: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204682.73521: Calling groups_plugins_play to load vars for managed-node3 49116 1727204682.74425: done sending task result for task 127b8e07-fff9-02f7-957b-000000000120 49116 1727204682.74429: WORKER PROCESS EXITING 49116 1727204682.74486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 49116 1727204682.75140: done with get_vars() 49116 1727204682.75152: variable 'ansible_search_path' from source: unknown 49116 1727204682.75153: variable 'ansible_search_path' from source: unknown 49116 1727204682.75207: we have included files to process 49116 1727204682.75209: generating all_blocks data 49116 1727204682.75210: done generating all_blocks data 49116 1727204682.75212: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204682.75213: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204682.75216: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204682.76134: done processing included file 49116 1727204682.76137: iterating over new_blocks loaded from include file 49116 1727204682.76139: in VariableManager get_vars() 49116 1727204682.76164: done with get_vars() 49116 1727204682.76288: filtering new block on tags 49116 1727204682.76313: done filtering new block on tags 49116 1727204682.76317: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 49116 1727204682.76323: extending task lists for all hosts with included blocks 49116 1727204682.76534: done extending task lists 49116 1727204682.76536: done processing included files 49116 1727204682.76536: results queue empty 49116 1727204682.76537: checking for any_errors_fatal 49116 1727204682.76540: done checking for any_errors_fatal 49116 1727204682.76541: checking for max_fail_percentage 49116 1727204682.76543: done checking for max_fail_percentage 49116 1727204682.76543: checking to see if all hosts have failed and the running result is not ok 49116 1727204682.76544: done checking to see if all hosts have failed 49116 1727204682.76545: getting the remaining hosts for this loop 49116 1727204682.76546: done getting the remaining hosts for this loop 49116 1727204682.76548: getting the next task for host managed-node3 49116 1727204682.76552: done getting next task for host managed-node3 49116 1727204682.76554: ^ task is: TASK: Gather current interface info 49116 1727204682.76557: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204682.76559: getting variables 49116 1727204682.76560: in VariableManager get_vars() 49116 1727204682.76580: Calling all_inventory to load vars for managed-node3 49116 1727204682.76583: Calling groups_inventory to load vars for managed-node3 49116 1727204682.76585: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204682.76591: Calling all_plugins_play to load vars for managed-node3 49116 1727204682.76594: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204682.76596: Calling groups_plugins_play to load vars for managed-node3 49116 1727204682.77167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204682.77654: done with get_vars() 49116 1727204682.77671: done getting variables 49116 1727204682.77969: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:04:42 -0400 (0:00:00.082) 0:00:05.805 ***** 49116 1727204682.78003: entering _queue_task() for managed-node3/command 49116 1727204682.78744: worker is 1 (out of 1 available) 49116 1727204682.78758: exiting _queue_task() for managed-node3/command 49116 1727204682.78774: done queuing things up, now waiting for results queue to drain 49116 1727204682.78775: waiting for pending results... 
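The worker queued above is about to run the "Gather current interface info" task from get_current_interfaces.yml. That file is not reproduced in this log, but the module arguments recorded further down (chdir=/sys/class/net, raw params "ls -1") suggest a task roughly equivalent to the sketch below; the register variable name is only an assumption for illustration:

    # Sketch reconstructed from the logged module_args; not the verbatim task file.
    - name: Gather current interface info
      ansible.builtin.command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces      # assumed name, not taken from the log
      changed_when: false                # inferred from the "changed": false reported below

Listing /sys/class/net is a common way to enumerate the kernel's network interfaces, which matches the stdout seen later in this log (bonding_masters, eth0, lo).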
49116 1727204682.79144: running TaskExecutor() for managed-node3/TASK: Gather current interface info 49116 1727204682.79599: in run() - task 127b8e07-fff9-02f7-957b-0000000001ff 49116 1727204682.79605: variable 'ansible_search_path' from source: unknown 49116 1727204682.79609: variable 'ansible_search_path' from source: unknown 49116 1727204682.79612: calling self._execute() 49116 1727204682.79835: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204682.79918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204682.79922: variable 'omit' from source: magic vars 49116 1727204682.80759: variable 'ansible_distribution_major_version' from source: facts 49116 1727204682.80782: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204682.80973: variable 'omit' from source: magic vars 49116 1727204682.80976: variable 'omit' from source: magic vars 49116 1727204682.81093: variable 'omit' from source: magic vars 49116 1727204682.81152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204682.81335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204682.81345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204682.81371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204682.81419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204682.81491: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204682.81500: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204682.81519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204682.81782: Set connection var ansible_connection to ssh 49116 1727204682.81801: Set connection var ansible_timeout to 10 49116 1727204682.81851: Set connection var ansible_shell_executable to /bin/sh 49116 1727204682.81861: Set connection var ansible_pipelining to False 49116 1727204682.81873: Set connection var ansible_shell_type to sh 49116 1727204682.82057: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204682.82061: variable 'ansible_shell_executable' from source: unknown 49116 1727204682.82063: variable 'ansible_connection' from source: unknown 49116 1727204682.82067: variable 'ansible_module_compression' from source: unknown 49116 1727204682.82070: variable 'ansible_shell_type' from source: unknown 49116 1727204682.82072: variable 'ansible_shell_executable' from source: unknown 49116 1727204682.82074: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204682.82076: variable 'ansible_pipelining' from source: unknown 49116 1727204682.82078: variable 'ansible_timeout' from source: unknown 49116 1727204682.82081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204682.82420: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204682.82442: variable 'omit' from source: magic vars 49116 
1727204682.82498: starting attempt loop 49116 1727204682.82526: running the handler 49116 1727204682.82529: _low_level_execute_command(): starting 49116 1727204682.82542: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204682.83985: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204682.83992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204682.84562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204682.84569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204682.84774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204682.84790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204682.84999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204682.87351: stdout chunk (state=3): >>>/root <<< 49116 1727204682.87355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204682.87358: stdout chunk (state=3): >>><<< 49116 1727204682.87360: stderr chunk (state=3): >>><<< 49116 1727204682.87364: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204682.87369: _low_level_execute_command(): starting 49116 1727204682.87372: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795 `" && echo ansible-tmp-1727204682.8725848-49525-261745667852795="` echo /root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795 `" ) && sleep 0' 49116 1727204682.89297: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204682.89411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204682.89529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204682.91719: stdout chunk (state=3): >>>ansible-tmp-1727204682.8725848-49525-261745667852795=/root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795 <<< 49116 1727204682.91902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204682.91951: stderr chunk (state=3): >>><<< 49116 1727204682.92003: stdout chunk (state=3): >>><<< 49116 1727204682.92031: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204682.8725848-49525-261745667852795=/root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204682.92144: variable 'ansible_module_compression' from source: unknown 49116 1727204682.92271: ANSIBALLZ: Using generic lock for ansible.legacy.command 49116 1727204682.92279: ANSIBALLZ: Acquiring lock 49116 1727204682.92287: ANSIBALLZ: Lock acquired: 
139720119767104 49116 1727204682.92295: ANSIBALLZ: Creating module 49116 1727204683.38859: ANSIBALLZ: Writing module into payload 49116 1727204683.39158: ANSIBALLZ: Writing module 49116 1727204683.39195: ANSIBALLZ: Renaming module 49116 1727204683.39284: ANSIBALLZ: Done creating module 49116 1727204683.39372: variable 'ansible_facts' from source: unknown 49116 1727204683.39481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/AnsiballZ_command.py 49116 1727204683.40074: Sending initial data 49116 1727204683.40078: Sent initial data (156 bytes) 49116 1727204683.41770: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204683.42043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204683.42480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204683.44376: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204683.44480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204683.44554: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpnxsbewwu /root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/AnsiballZ_command.py <<< 49116 1727204683.44557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/AnsiballZ_command.py" <<< 49116 1727204683.45074: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpnxsbewwu" to remote "/root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/AnsiballZ_command.py" <<< 49116 1727204683.47609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204683.47613: stdout chunk (state=3): >>><<< 49116 1727204683.47617: stderr chunk (state=3): >>><<< 49116 1727204683.47629: done transferring module to remote 49116 1727204683.47648: _low_level_execute_command(): starting 49116 1727204683.47658: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/ /root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/AnsiballZ_command.py && sleep 0' 49116 1727204683.49555: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204683.49576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204683.49802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204683.49820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204683.50012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204683.50286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204683.52344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204683.52746: stderr chunk (state=3): >>><<< 49116 1727204683.52757: stdout chunk (state=3): >>><<< 49116 1727204683.52761: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204683.52763: _low_level_execute_command(): starting 49116 1727204683.52768: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/AnsiballZ_command.py && sleep 0' 49116 1727204683.54331: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204683.54354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204683.54594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204683.54795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204683.54928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204683.73833: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:43.732849", "end": "2024-09-24 15:04:43.736689", "delta": "0:00:00.003840", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204683.75975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204683.75980: stdout chunk (state=3): >>><<< 49116 1727204683.75982: stderr chunk (state=3): >>><<< 49116 1727204683.75987: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:43.732849", "end": "2024-09-24 15:04:43.736689", "delta": "0:00:00.003840", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
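The raw module JSON above reports "changed": true (the command module's default), while the task result printed just below is ok with "changed": false, which is the behaviour produced when the task itself overrides the change status (for example with changed_when: false). To inspect the registered output interactively, a debug task such as the following sketch could be used; the variable name continues the assumption made in the earlier sketch and is not taken from this log:

    # Illustrative only; not part of the test being executed here.
    - name: Show the interfaces reported by the previous command
      ansible.builtin.debug:
        var: _current_interfaces.stdout_lines   # assumed register name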
49116 1727204683.75989: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204683.75992: _low_level_execute_command(): starting 49116 1727204683.75994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204682.8725848-49525-261745667852795/ > /dev/null 2>&1 && sleep 0' 49116 1727204683.77760: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204683.77992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204683.78092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204683.78222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204683.80474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204683.80478: stdout chunk (state=3): >>><<< 49116 1727204683.80487: stderr chunk (state=3): >>><<< 49116 1727204683.80490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204683.80492: handler run complete 49116 1727204683.80607: Evaluated conditional (False): False 49116 1727204683.80622: attempt loop complete, returning result 49116 1727204683.80625: _execute() done 49116 1727204683.80628: dumping result to json 49116 1727204683.80635: done dumping result, returning 49116 1727204683.80643: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [127b8e07-fff9-02f7-957b-0000000001ff] 49116 1727204683.80675: sending task result for task 127b8e07-fff9-02f7-957b-0000000001ff 49116 1727204683.81008: done sending task result for task 127b8e07-fff9-02f7-957b-0000000001ff 49116 1727204683.81011: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003840", "end": "2024-09-24 15:04:43.736689", "rc": 0, "start": "2024-09-24 15:04:43.732849" } STDOUT: bonding_masters eth0 lo 49116 1727204683.81119: no more pending results, returning what we have 49116 1727204683.81122: results queue empty 49116 1727204683.81123: checking for any_errors_fatal 49116 1727204683.81125: done checking for any_errors_fatal 49116 1727204683.81125: checking for max_fail_percentage 49116 1727204683.81127: done checking for max_fail_percentage 49116 1727204683.81128: checking to see if all hosts have failed and the running result is not ok 49116 1727204683.81129: done checking to see if all hosts have failed 49116 1727204683.81130: getting the remaining hosts for this loop 49116 1727204683.81131: done getting the remaining hosts for this loop 49116 1727204683.81138: getting the next task for host managed-node3 49116 1727204683.81145: done getting next task for host managed-node3 49116 1727204683.81148: ^ task is: TASK: Set current_interfaces 49116 1727204683.81152: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204683.81156: getting variables 49116 1727204683.81157: in VariableManager get_vars() 49116 1727204683.81204: Calling all_inventory to load vars for managed-node3 49116 1727204683.81207: Calling groups_inventory to load vars for managed-node3 49116 1727204683.81209: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204683.81222: Calling all_plugins_play to load vars for managed-node3 49116 1727204683.81225: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204683.81227: Calling groups_plugins_play to load vars for managed-node3 49116 1727204683.81934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204683.82985: done with get_vars() 49116 1727204683.83000: done getting variables 49116 1727204683.83064: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:04:43 -0400 (0:00:01.050) 0:00:06.856 ***** 49116 1727204683.83103: entering _queue_task() for managed-node3/set_fact 49116 1727204683.84306: worker is 1 (out of 1 available) 49116 1727204683.84322: exiting _queue_task() for managed-node3/set_fact 49116 1727204683.84340: done queuing things up, now waiting for results queue to drain 49116 1727204683.84342: waiting for pending results... 
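The task path reported here is get_current_interfaces.yml:9, and the executor reads the registered `_current_interfaces` variable before producing the `current_interfaces` fact shown further down. A plausible sketch of that set_fact task, assuming a stdout_lines expression consistent with the logged result:

```yaml
# Plausible shape of get_current_interfaces.yml:9; the exact expression is an
# assumption, but the resulting fact matches the log output below.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```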
49116 1727204683.84851: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 49116 1727204683.85273: in run() - task 127b8e07-fff9-02f7-957b-000000000200 49116 1727204683.85277: variable 'ansible_search_path' from source: unknown 49116 1727204683.85280: variable 'ansible_search_path' from source: unknown 49116 1727204683.85283: calling self._execute() 49116 1727204683.85332: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204683.85383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204683.85399: variable 'omit' from source: magic vars 49116 1727204683.86473: variable 'ansible_distribution_major_version' from source: facts 49116 1727204683.86478: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204683.86480: variable 'omit' from source: magic vars 49116 1727204683.86671: variable 'omit' from source: magic vars 49116 1727204683.86872: variable '_current_interfaces' from source: set_fact 49116 1727204683.87100: variable 'omit' from source: magic vars 49116 1727204683.87149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204683.87316: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204683.87343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204683.87370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204683.87391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204683.87428: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204683.87505: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204683.87513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204683.87738: Set connection var ansible_connection to ssh 49116 1727204683.87757: Set connection var ansible_timeout to 10 49116 1727204683.87771: Set connection var ansible_shell_executable to /bin/sh 49116 1727204683.87782: Set connection var ansible_pipelining to False 49116 1727204683.87829: Set connection var ansible_shell_type to sh 49116 1727204683.87841: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204683.87877: variable 'ansible_shell_executable' from source: unknown 49116 1727204683.88042: variable 'ansible_connection' from source: unknown 49116 1727204683.88046: variable 'ansible_module_compression' from source: unknown 49116 1727204683.88048: variable 'ansible_shell_type' from source: unknown 49116 1727204683.88050: variable 'ansible_shell_executable' from source: unknown 49116 1727204683.88052: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204683.88054: variable 'ansible_pipelining' from source: unknown 49116 1727204683.88056: variable 'ansible_timeout' from source: unknown 49116 1727204683.88058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204683.88327: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 49116 1727204683.88346: variable 'omit' from source: magic vars 49116 1727204683.88375: starting attempt loop 49116 1727204683.88382: running the handler 49116 1727204683.88571: handler run complete 49116 1727204683.88574: attempt loop complete, returning result 49116 1727204683.88578: _execute() done 49116 1727204683.88581: dumping result to json 49116 1727204683.88583: done dumping result, returning 49116 1727204683.88586: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [127b8e07-fff9-02f7-957b-000000000200] 49116 1727204683.88589: sending task result for task 127b8e07-fff9-02f7-957b-000000000200 49116 1727204683.88772: done sending task result for task 127b8e07-fff9-02f7-957b-000000000200 49116 1727204683.88892: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 49116 1727204683.88961: no more pending results, returning what we have 49116 1727204683.88965: results queue empty 49116 1727204683.88968: checking for any_errors_fatal 49116 1727204683.88974: done checking for any_errors_fatal 49116 1727204683.88975: checking for max_fail_percentage 49116 1727204683.88977: done checking for max_fail_percentage 49116 1727204683.88978: checking to see if all hosts have failed and the running result is not ok 49116 1727204683.88979: done checking to see if all hosts have failed 49116 1727204683.88979: getting the remaining hosts for this loop 49116 1727204683.88981: done getting the remaining hosts for this loop 49116 1727204683.88986: getting the next task for host managed-node3 49116 1727204683.88994: done getting next task for host managed-node3 49116 1727204683.88997: ^ task is: TASK: Show current_interfaces 49116 1727204683.89000: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204683.89004: getting variables 49116 1727204683.89006: in VariableManager get_vars() 49116 1727204683.89052: Calling all_inventory to load vars for managed-node3 49116 1727204683.89055: Calling groups_inventory to load vars for managed-node3 49116 1727204683.89058: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204683.89473: Calling all_plugins_play to load vars for managed-node3 49116 1727204683.89478: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204683.89483: Calling groups_plugins_play to load vars for managed-node3 49116 1727204683.90149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204683.90812: done with get_vars() 49116 1727204683.90828: done getting variables 49116 1727204683.91338: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.082) 0:00:06.938 ***** 49116 1727204683.91373: entering _queue_task() for managed-node3/debug 49116 1727204683.91375: Creating lock for debug 49116 1727204683.92553: worker is 1 (out of 1 available) 49116 1727204683.92571: exiting _queue_task() for managed-node3/debug 49116 1727204683.92584: done queuing things up, now waiting for results queue to drain 49116 1727204683.92586: waiting for pending results... 
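The next task, at show_interfaces.yml:5, is handled by the debug action module and prints the "current_interfaces: [...]" message seen below. A sketch of what that task likely looks like; the msg template is an assumption chosen to reproduce the logged output:

```yaml
# Likely form of show_interfaces.yml:5, inferred from the rendered MSG below.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```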
49116 1727204683.93125: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 49116 1727204683.93335: in run() - task 127b8e07-fff9-02f7-957b-000000000121 49116 1727204683.93419: variable 'ansible_search_path' from source: unknown 49116 1727204683.93509: variable 'ansible_search_path' from source: unknown 49116 1727204683.93529: calling self._execute() 49116 1727204683.93838: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204683.93843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204683.93847: variable 'omit' from source: magic vars 49116 1727204683.94711: variable 'ansible_distribution_major_version' from source: facts 49116 1727204683.94790: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204683.94804: variable 'omit' from source: magic vars 49116 1727204683.94859: variable 'omit' from source: magic vars 49116 1727204683.95176: variable 'current_interfaces' from source: set_fact 49116 1727204683.95262: variable 'omit' from source: magic vars 49116 1727204683.95360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204683.95427: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204683.95493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204683.95561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204683.95875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204683.95879: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204683.95881: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204683.95883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204683.95885: Set connection var ansible_connection to ssh 49116 1727204683.95887: Set connection var ansible_timeout to 10 49116 1727204683.95889: Set connection var ansible_shell_executable to /bin/sh 49116 1727204683.95891: Set connection var ansible_pipelining to False 49116 1727204683.95987: Set connection var ansible_shell_type to sh 49116 1727204683.96000: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204683.96123: variable 'ansible_shell_executable' from source: unknown 49116 1727204683.96136: variable 'ansible_connection' from source: unknown 49116 1727204683.96144: variable 'ansible_module_compression' from source: unknown 49116 1727204683.96152: variable 'ansible_shell_type' from source: unknown 49116 1727204683.96158: variable 'ansible_shell_executable' from source: unknown 49116 1727204683.96165: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204683.96176: variable 'ansible_pipelining' from source: unknown 49116 1727204683.96183: variable 'ansible_timeout' from source: unknown 49116 1727204683.96191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204683.96752: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
49116 1727204683.96759: variable 'omit' from source: magic vars 49116 1727204683.96762: starting attempt loop 49116 1727204683.96764: running the handler 49116 1727204683.96767: handler run complete 49116 1727204683.96772: attempt loop complete, returning result 49116 1727204683.96866: _execute() done 49116 1727204683.96876: dumping result to json 49116 1727204683.96883: done dumping result, returning 49116 1727204683.96895: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [127b8e07-fff9-02f7-957b-000000000121] 49116 1727204683.96904: sending task result for task 127b8e07-fff9-02f7-957b-000000000121 49116 1727204683.97029: done sending task result for task 127b8e07-fff9-02f7-957b-000000000121 ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 49116 1727204683.97100: no more pending results, returning what we have 49116 1727204683.97103: results queue empty 49116 1727204683.97104: checking for any_errors_fatal 49116 1727204683.97110: done checking for any_errors_fatal 49116 1727204683.97111: checking for max_fail_percentage 49116 1727204683.97113: done checking for max_fail_percentage 49116 1727204683.97114: checking to see if all hosts have failed and the running result is not ok 49116 1727204683.97114: done checking to see if all hosts have failed 49116 1727204683.97115: getting the remaining hosts for this loop 49116 1727204683.97117: done getting the remaining hosts for this loop 49116 1727204683.97121: getting the next task for host managed-node3 49116 1727204683.97129: done getting next task for host managed-node3 49116 1727204683.97132: ^ task is: TASK: Include the task 'manage_test_interface.yml' 49116 1727204683.97134: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204683.97138: getting variables 49116 1727204683.97140: in VariableManager get_vars() 49116 1727204683.97193: Calling all_inventory to load vars for managed-node3 49116 1727204683.97197: Calling groups_inventory to load vars for managed-node3 49116 1727204683.97199: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204683.97213: Calling all_plugins_play to load vars for managed-node3 49116 1727204683.97218: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204683.97222: Calling groups_plugins_play to load vars for managed-node3 49116 1727204683.97976: WORKER PROCESS EXITING 49116 1727204683.98124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204683.99118: done with get_vars() 49116 1727204683.99135: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:12 Tuesday 24 September 2024 15:04:43 -0400 (0:00:00.081) 0:00:07.019 ***** 49116 1727204683.99482: entering _queue_task() for managed-node3/include_tasks 49116 1727204684.00516: worker is 1 (out of 1 available) 49116 1727204684.00534: exiting _queue_task() for managed-node3/include_tasks 49116 1727204684.00548: done queuing things up, now waiting for results queue to drain 49116 1727204684.00549: waiting for pending results... 
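The include at tests_vlan_mtu.yml:12 is handled by the include_tasks action (the executor enters _queue_task for include_tasks), and the file it pulls in is tasks/manage_test_interface.yml per the "processing included file" entries. A sketch of the include; the relative path is an assumption derived from the resolved absolute path in the log:

```yaml
# Sketch of the include at tests_vlan_mtu.yml:12; the relative path is inferred
# from the absolute path the loader reports, not copied from the playbook.
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
```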
49116 1727204684.01292: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 49116 1727204684.01437: in run() - task 127b8e07-fff9-02f7-957b-00000000000c 49116 1727204684.01520: variable 'ansible_search_path' from source: unknown 49116 1727204684.01628: calling self._execute() 49116 1727204684.02074: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204684.02192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204684.02196: variable 'omit' from source: magic vars 49116 1727204684.03272: variable 'ansible_distribution_major_version' from source: facts 49116 1727204684.03313: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204684.03445: _execute() done 49116 1727204684.03449: dumping result to json 49116 1727204684.03451: done dumping result, returning 49116 1727204684.03454: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [127b8e07-fff9-02f7-957b-00000000000c] 49116 1727204684.03458: sending task result for task 127b8e07-fff9-02f7-957b-00000000000c 49116 1727204684.03889: no more pending results, returning what we have 49116 1727204684.03895: in VariableManager get_vars() 49116 1727204684.03960: Calling all_inventory to load vars for managed-node3 49116 1727204684.03963: Calling groups_inventory to load vars for managed-node3 49116 1727204684.03967: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204684.03984: Calling all_plugins_play to load vars for managed-node3 49116 1727204684.03987: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204684.03990: Calling groups_plugins_play to load vars for managed-node3 49116 1727204684.04737: done sending task result for task 127b8e07-fff9-02f7-957b-00000000000c 49116 1727204684.04743: WORKER PROCESS EXITING 49116 1727204684.04794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204684.05364: done with get_vars() 49116 1727204684.05469: variable 'ansible_search_path' from source: unknown 49116 1727204684.05490: we have included files to process 49116 1727204684.05492: generating all_blocks data 49116 1727204684.05494: done generating all_blocks data 49116 1727204684.05499: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49116 1727204684.05501: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49116 1727204684.05504: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49116 1727204684.07595: in VariableManager get_vars() 49116 1727204684.07623: done with get_vars() 49116 1727204684.08404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 49116 1727204684.09852: done processing included file 49116 1727204684.09854: iterating over new_blocks loaded from include file 49116 1727204684.09856: in VariableManager get_vars() 49116 1727204684.09885: done with get_vars() 49116 1727204684.09968: filtering new block on tags 49116 1727204684.10015: done filtering new block on tags 49116 1727204684.10019: done iterating over new_blocks loaded from include file included: 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 49116 1727204684.10691: extending task lists for all hosts with included blocks 49116 1727204684.16933: done extending task lists 49116 1727204684.16936: done processing included files 49116 1727204684.16936: results queue empty 49116 1727204684.16937: checking for any_errors_fatal 49116 1727204684.16941: done checking for any_errors_fatal 49116 1727204684.16942: checking for max_fail_percentage 49116 1727204684.16943: done checking for max_fail_percentage 49116 1727204684.16944: checking to see if all hosts have failed and the running result is not ok 49116 1727204684.16945: done checking to see if all hosts have failed 49116 1727204684.16945: getting the remaining hosts for this loop 49116 1727204684.16947: done getting the remaining hosts for this loop 49116 1727204684.16950: getting the next task for host managed-node3 49116 1727204684.16954: done getting next task for host managed-node3 49116 1727204684.17070: ^ task is: TASK: Ensure state in ["present", "absent"] 49116 1727204684.17074: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204684.17083: getting variables 49116 1727204684.17084: in VariableManager get_vars() 49116 1727204684.17124: Calling all_inventory to load vars for managed-node3 49116 1727204684.17127: Calling groups_inventory to load vars for managed-node3 49116 1727204684.17129: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204684.17137: Calling all_plugins_play to load vars for managed-node3 49116 1727204684.17140: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204684.17143: Calling groups_plugins_play to load vars for managed-node3 49116 1727204684.17629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204684.18432: done with get_vars() 49116 1727204684.18449: done getting variables 49116 1727204684.18695: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.193) 0:00:07.213 ***** 49116 1727204684.18848: entering _queue_task() for managed-node3/fail 49116 1727204684.18851: Creating lock for fail 49116 1727204684.19663: worker is 1 (out of 1 available) 49116 1727204684.19678: exiting _queue_task() for managed-node3/fail 49116 1727204684.19691: done queuing things up, now waiting for results queue to drain 49116 1727204684.19693: waiting for pending results... 
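manage_test_interface.yml opens with a guard task at line 3 that uses the fail action; the skip result below quotes its condition verbatim as the false_condition. A sketch of that guard, where the when expression is copied from the log and the fail message is a placeholder:

```yaml
# Reconstructed guard from manage_test_interface.yml:3; the when expression is
# copied from the logged false_condition, the msg text is a placeholder.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"
  when: state not in ["present", "absent"]
```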
49116 1727204684.20474: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 49116 1727204684.20513: in run() - task 127b8e07-fff9-02f7-957b-00000000021b 49116 1727204684.20538: variable 'ansible_search_path' from source: unknown 49116 1727204684.20614: variable 'ansible_search_path' from source: unknown 49116 1727204684.20652: calling self._execute() 49116 1727204684.20943: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204684.20947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204684.20950: variable 'omit' from source: magic vars 49116 1727204684.21974: variable 'ansible_distribution_major_version' from source: facts 49116 1727204684.21978: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204684.22443: variable 'state' from source: include params 49116 1727204684.22447: Evaluated conditional (state not in ["present", "absent"]): False 49116 1727204684.22450: when evaluation is False, skipping this task 49116 1727204684.22452: _execute() done 49116 1727204684.22459: dumping result to json 49116 1727204684.22471: done dumping result, returning 49116 1727204684.22483: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [127b8e07-fff9-02f7-957b-00000000021b] 49116 1727204684.22494: sending task result for task 127b8e07-fff9-02f7-957b-00000000021b skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 49116 1727204684.22713: no more pending results, returning what we have 49116 1727204684.22717: results queue empty 49116 1727204684.22719: checking for any_errors_fatal 49116 1727204684.22720: done checking for any_errors_fatal 49116 1727204684.22721: checking for max_fail_percentage 49116 1727204684.22723: done checking for max_fail_percentage 49116 1727204684.22724: checking to see if all hosts have failed and the running result is not ok 49116 1727204684.22725: done checking to see if all hosts have failed 49116 1727204684.22726: getting the remaining hosts for this loop 49116 1727204684.22727: done getting the remaining hosts for this loop 49116 1727204684.22732: getting the next task for host managed-node3 49116 1727204684.22739: done getting next task for host managed-node3 49116 1727204684.22742: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 49116 1727204684.22746: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204684.22750: getting variables 49116 1727204684.22752: in VariableManager get_vars() 49116 1727204684.22808: Calling all_inventory to load vars for managed-node3 49116 1727204684.22812: Calling groups_inventory to load vars for managed-node3 49116 1727204684.22816: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204684.22832: Calling all_plugins_play to load vars for managed-node3 49116 1727204684.22836: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204684.22839: Calling groups_plugins_play to load vars for managed-node3 49116 1727204684.24068: done sending task result for task 127b8e07-fff9-02f7-957b-00000000021b 49116 1727204684.24074: WORKER PROCESS EXITING 49116 1727204684.24121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204684.24829: done with get_vars() 49116 1727204684.24961: done getting variables 49116 1727204684.25027: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.062) 0:00:07.275 ***** 49116 1727204684.25269: entering _queue_task() for managed-node3/fail 49116 1727204684.25782: worker is 1 (out of 1 available) 49116 1727204684.25795: exiting _queue_task() for managed-node3/fail 49116 1727204684.26030: done queuing things up, now waiting for results queue to drain 49116 1727204684.26032: waiting for pending results... 
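The companion guard at manage_test_interface.yml:8 checks the `type` play variable the same way; again the condition is logged below as the false_condition. A sketch under the same assumptions (placeholder msg):

```yaml
# Reconstructed guard from manage_test_interface.yml:8; condition taken from the
# logged false_condition, msg is a placeholder.
- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of: dummy, tap, veth"
  when: type not in ["dummy", "tap", "veth"]
```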
49116 1727204684.26415: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 49116 1727204684.26612: in run() - task 127b8e07-fff9-02f7-957b-00000000021c 49116 1727204684.26703: variable 'ansible_search_path' from source: unknown 49116 1727204684.26713: variable 'ansible_search_path' from source: unknown 49116 1727204684.26885: calling self._execute() 49116 1727204684.27339: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204684.27344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204684.27347: variable 'omit' from source: magic vars 49116 1727204684.28253: variable 'ansible_distribution_major_version' from source: facts 49116 1727204684.28313: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204684.28894: variable 'type' from source: play vars 49116 1727204684.28899: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 49116 1727204684.28901: when evaluation is False, skipping this task 49116 1727204684.28903: _execute() done 49116 1727204684.28906: dumping result to json 49116 1727204684.28909: done dumping result, returning 49116 1727204684.28912: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [127b8e07-fff9-02f7-957b-00000000021c] 49116 1727204684.28914: sending task result for task 127b8e07-fff9-02f7-957b-00000000021c 49116 1727204684.29241: done sending task result for task 127b8e07-fff9-02f7-957b-00000000021c 49116 1727204684.29246: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 49116 1727204684.29310: no more pending results, returning what we have 49116 1727204684.29315: results queue empty 49116 1727204684.29316: checking for any_errors_fatal 49116 1727204684.29323: done checking for any_errors_fatal 49116 1727204684.29324: checking for max_fail_percentage 49116 1727204684.29326: done checking for max_fail_percentage 49116 1727204684.29327: checking to see if all hosts have failed and the running result is not ok 49116 1727204684.29328: done checking to see if all hosts have failed 49116 1727204684.29329: getting the remaining hosts for this loop 49116 1727204684.29330: done getting the remaining hosts for this loop 49116 1727204684.29335: getting the next task for host managed-node3 49116 1727204684.29343: done getting next task for host managed-node3 49116 1727204684.29346: ^ task is: TASK: Include the task 'show_interfaces.yml' 49116 1727204684.29350: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204684.29356: getting variables 49116 1727204684.29358: in VariableManager get_vars() 49116 1727204684.29411: Calling all_inventory to load vars for managed-node3 49116 1727204684.29414: Calling groups_inventory to load vars for managed-node3 49116 1727204684.29416: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204684.29432: Calling all_plugins_play to load vars for managed-node3 49116 1727204684.29436: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204684.29440: Calling groups_plugins_play to load vars for managed-node3 49116 1727204684.30240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204684.30819: done with get_vars() 49116 1727204684.30833: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.059) 0:00:07.335 ***** 49116 1727204684.31055: entering _queue_task() for managed-node3/include_tasks 49116 1727204684.31826: worker is 1 (out of 1 available) 49116 1727204684.31957: exiting _queue_task() for managed-node3/include_tasks 49116 1727204684.31975: done queuing things up, now waiting for results queue to drain 49116 1727204684.31977: waiting for pending results... 49116 1727204684.32707: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 49116 1727204684.32829: in run() - task 127b8e07-fff9-02f7-957b-00000000021d 49116 1727204684.32834: variable 'ansible_search_path' from source: unknown 49116 1727204684.32836: variable 'ansible_search_path' from source: unknown 49116 1727204684.32839: calling self._execute() 49116 1727204684.33020: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204684.33238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204684.33242: variable 'omit' from source: magic vars 49116 1727204684.34003: variable 'ansible_distribution_major_version' from source: facts 49116 1727204684.34033: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204684.34045: _execute() done 49116 1727204684.34080: dumping result to json 49116 1727204684.34088: done dumping result, returning 49116 1727204684.34098: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-02f7-957b-00000000021d] 49116 1727204684.34115: sending task result for task 127b8e07-fff9-02f7-957b-00000000021d 49116 1727204684.34433: done sending task result for task 127b8e07-fff9-02f7-957b-00000000021d 49116 1727204684.34437: WORKER PROCESS EXITING 49116 1727204684.34483: no more pending results, returning what we have 49116 1727204684.34489: in VariableManager get_vars() 49116 1727204684.34543: Calling all_inventory to load vars for managed-node3 49116 1727204684.34547: Calling groups_inventory to load vars for managed-node3 49116 1727204684.34549: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204684.34571: Calling all_plugins_play to load vars for managed-node3 49116 1727204684.34575: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204684.34579: Calling groups_plugins_play to load vars for managed-node3 49116 1727204684.35125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 49116 1727204684.35730: done with get_vars() 49116 1727204684.35856: variable 'ansible_search_path' from source: unknown 49116 1727204684.35858: variable 'ansible_search_path' from source: unknown 49116 1727204684.35905: we have included files to process 49116 1727204684.35906: generating all_blocks data 49116 1727204684.35908: done generating all_blocks data 49116 1727204684.35913: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204684.35914: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204684.35917: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204684.36150: in VariableManager get_vars() 49116 1727204684.36292: done with get_vars() 49116 1727204684.36701: done processing included file 49116 1727204684.36703: iterating over new_blocks loaded from include file 49116 1727204684.36705: in VariableManager get_vars() 49116 1727204684.36728: done with get_vars() 49116 1727204684.36730: filtering new block on tags 49116 1727204684.36751: done filtering new block on tags 49116 1727204684.36754: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 49116 1727204684.36760: extending task lists for all hosts with included blocks 49116 1727204684.37747: done extending task lists 49116 1727204684.37749: done processing included files 49116 1727204684.37750: results queue empty 49116 1727204684.37750: checking for any_errors_fatal 49116 1727204684.37754: done checking for any_errors_fatal 49116 1727204684.37755: checking for max_fail_percentage 49116 1727204684.37756: done checking for max_fail_percentage 49116 1727204684.37757: checking to see if all hosts have failed and the running result is not ok 49116 1727204684.37758: done checking to see if all hosts have failed 49116 1727204684.37873: getting the remaining hosts for this loop 49116 1727204684.37875: done getting the remaining hosts for this loop 49116 1727204684.37879: getting the next task for host managed-node3 49116 1727204684.37884: done getting next task for host managed-node3 49116 1727204684.37886: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 49116 1727204684.37889: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204684.37892: getting variables 49116 1727204684.37893: in VariableManager get_vars() 49116 1727204684.37951: Calling all_inventory to load vars for managed-node3 49116 1727204684.37954: Calling groups_inventory to load vars for managed-node3 49116 1727204684.37957: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204684.37963: Calling all_plugins_play to load vars for managed-node3 49116 1727204684.38072: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204684.38078: Calling groups_plugins_play to load vars for managed-node3 49116 1727204684.38429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204684.38963: done with get_vars() 49116 1727204684.38981: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.080) 0:00:07.415 ***** 49116 1727204684.39189: entering _queue_task() for managed-node3/include_tasks 49116 1727204684.40322: worker is 1 (out of 1 available) 49116 1727204684.40335: exiting _queue_task() for managed-node3/include_tasks 49116 1727204684.40350: done queuing things up, now waiting for results queue to drain 49116 1727204684.40351: waiting for pending results... 49116 1727204684.40976: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 49116 1727204684.41721: in run() - task 127b8e07-fff9-02f7-957b-000000000314 49116 1727204684.41726: variable 'ansible_search_path' from source: unknown 49116 1727204684.41732: variable 'ansible_search_path' from source: unknown 49116 1727204684.41735: calling self._execute() 49116 1727204684.41854: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204684.41858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204684.41895: variable 'omit' from source: magic vars 49116 1727204684.42774: variable 'ansible_distribution_major_version' from source: facts 49116 1727204684.42778: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204684.42781: _execute() done 49116 1727204684.42783: dumping result to json 49116 1727204684.42785: done dumping result, returning 49116 1727204684.42788: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-02f7-957b-000000000314] 49116 1727204684.42790: sending task result for task 127b8e07-fff9-02f7-957b-000000000314 49116 1727204684.43243: no more pending results, returning what we have 49116 1727204684.43247: in VariableManager get_vars() 49116 1727204684.43298: Calling all_inventory to load vars for managed-node3 49116 1727204684.43302: Calling groups_inventory to load vars for managed-node3 49116 1727204684.43304: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204684.43317: Calling all_plugins_play to load vars for managed-node3 49116 1727204684.43320: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204684.43324: Calling groups_plugins_play to load vars for managed-node3 49116 1727204684.43713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204684.44214: done with get_vars() 49116 1727204684.44225: variable 'ansible_search_path' from source: 
unknown 49116 1727204684.44227: variable 'ansible_search_path' from source: unknown 49116 1727204684.44361: done sending task result for task 127b8e07-fff9-02f7-957b-000000000314 49116 1727204684.44366: WORKER PROCESS EXITING 49116 1727204684.44433: we have included files to process 49116 1727204684.44435: generating all_blocks data 49116 1727204684.44437: done generating all_blocks data 49116 1727204684.44438: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204684.44439: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204684.44442: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204684.45183: done processing included file 49116 1727204684.45186: iterating over new_blocks loaded from include file 49116 1727204684.45188: in VariableManager get_vars() 49116 1727204684.45213: done with get_vars() 49116 1727204684.45471: filtering new block on tags 49116 1727204684.45503: done filtering new block on tags 49116 1727204684.45506: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 49116 1727204684.45511: extending task lists for all hosts with included blocks 49116 1727204684.45808: done extending task lists 49116 1727204684.45809: done processing included files 49116 1727204684.45810: results queue empty 49116 1727204684.45811: checking for any_errors_fatal 49116 1727204684.45814: done checking for any_errors_fatal 49116 1727204684.45815: checking for max_fail_percentage 49116 1727204684.45944: done checking for max_fail_percentage 49116 1727204684.45946: checking to see if all hosts have failed and the running result is not ok 49116 1727204684.45947: done checking to see if all hosts have failed 49116 1727204684.45948: getting the remaining hosts for this loop 49116 1727204684.45949: done getting the remaining hosts for this loop 49116 1727204684.45953: getting the next task for host managed-node3 49116 1727204684.45959: done getting next task for host managed-node3 49116 1727204684.45961: ^ task is: TASK: Gather current interface info 49116 1727204684.45965: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 49116 1727204684.46043: getting variables 49116 1727204684.46045: in VariableManager get_vars() 49116 1727204684.46067: Calling all_inventory to load vars for managed-node3 49116 1727204684.46070: Calling groups_inventory to load vars for managed-node3 49116 1727204684.46072: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204684.46086: Calling all_plugins_play to load vars for managed-node3 49116 1727204684.46089: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204684.46092: Calling groups_plugins_play to load vars for managed-node3 49116 1727204684.46494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204684.46984: done with get_vars() 49116 1727204684.46998: done getting variables 49116 1727204684.47114: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:04:44 -0400 (0:00:00.080) 0:00:07.497 ***** 49116 1727204684.47272: entering _queue_task() for managed-node3/command 49116 1727204684.48081: worker is 1 (out of 1 available) 49116 1727204684.48096: exiting _queue_task() for managed-node3/command 49116 1727204684.48112: done queuing things up, now waiting for results queue to drain 49116 1727204684.48114: waiting for pending results... 
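In the execution that follows, the executor resolves its connection vars (ssh connection, timeout 10, /bin/sh shell, pipelining False, ZIP_DEFLATED module compression) and, because pipelining is off, creates a remote tmp directory and transfers the AnsiballZ payload over SFTP before running it. For illustration only, these are the kinds of settings that could be pinned in inventory or group_vars; the values below simply mirror what this run reports and in this run mostly come from defaults rather than inventory:

```yaml
# Illustration (assumed group_vars snippet); values mirror the connection vars
# the executor reports below, they are not taken from the test inventory.
ansible_connection: ssh
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false   # with pipelining off, Ansible mkdirs a remote tmp dir and SFTPs AnsiballZ_command.py
```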
49116 1727204684.48637: running TaskExecutor() for managed-node3/TASK: Gather current interface info 49116 1727204684.48760: in run() - task 127b8e07-fff9-02f7-957b-00000000034b 49116 1727204684.48981: variable 'ansible_search_path' from source: unknown 49116 1727204684.48986: variable 'ansible_search_path' from source: unknown 49116 1727204684.49028: calling self._execute() 49116 1727204684.49122: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204684.49131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204684.49248: variable 'omit' from source: magic vars 49116 1727204684.50239: variable 'ansible_distribution_major_version' from source: facts 49116 1727204684.50255: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204684.50262: variable 'omit' from source: magic vars 49116 1727204684.50378: variable 'omit' from source: magic vars 49116 1727204684.50694: variable 'omit' from source: magic vars 49116 1727204684.50746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204684.50788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204684.50812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204684.50839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204684.50911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204684.51400: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204684.51404: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204684.51408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204684.51545: Set connection var ansible_connection to ssh 49116 1727204684.51701: Set connection var ansible_timeout to 10 49116 1727204684.51715: Set connection var ansible_shell_executable to /bin/sh 49116 1727204684.51718: Set connection var ansible_pipelining to False 49116 1727204684.51721: Set connection var ansible_shell_type to sh 49116 1727204684.51723: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204684.51756: variable 'ansible_shell_executable' from source: unknown 49116 1727204684.51760: variable 'ansible_connection' from source: unknown 49116 1727204684.51763: variable 'ansible_module_compression' from source: unknown 49116 1727204684.51769: variable 'ansible_shell_type' from source: unknown 49116 1727204684.51772: variable 'ansible_shell_executable' from source: unknown 49116 1727204684.51774: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204684.51776: variable 'ansible_pipelining' from source: unknown 49116 1727204684.51779: variable 'ansible_timeout' from source: unknown 49116 1727204684.51880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204684.52697: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204684.52705: variable 'omit' from source: magic vars 49116 
1727204684.52707: starting attempt loop 49116 1727204684.52710: running the handler 49116 1727204684.52712: _low_level_execute_command(): starting 49116 1727204684.52715: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204684.55416: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204684.55939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204684.55945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204684.57904: stdout chunk (state=3): >>>/root <<< 49116 1727204684.57910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204684.57916: stdout chunk (state=3): >>><<< 49116 1727204684.57923: stderr chunk (state=3): >>><<< 49116 1727204684.58102: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204684.58105: _low_level_execute_command(): starting 49116 1727204684.58109: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429 `" && echo ansible-tmp-1727204684.5801485-49702-97775753935429="` echo /root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429 `" ) && sleep 0' 49116 1727204684.59564: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204684.59573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204684.59625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204684.59727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204684.59931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204684.60282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204684.62242: stdout chunk (state=3): >>>ansible-tmp-1727204684.5801485-49702-97775753935429=/root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429 <<< 49116 1727204684.62646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204684.62650: stdout chunk (state=3): >>><<< 49116 1727204684.62652: stderr chunk (state=3): >>><<< 49116 1727204684.62655: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204684.5801485-49702-97775753935429=/root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204684.62657: variable 'ansible_module_compression' from source: unknown 49116 1727204684.62716: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204684.62888: variable 'ansible_facts' from source: unknown 49116 1727204684.62969: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/AnsiballZ_command.py 49116 1727204684.63998: Sending initial data 49116 1727204684.64002: Sent initial data (155 bytes) 49116 1727204684.65729: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204684.65734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204684.65834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204684.65855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204684.65871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204684.66065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204684.67845: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204684.67877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204684.68122: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp5rnqd7_y /root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/AnsiballZ_command.py <<< 49116 1727204684.68126: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/AnsiballZ_command.py" <<< 49116 1727204684.68688: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp5rnqd7_y" to remote "/root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/AnsiballZ_command.py" <<< 49116 1727204684.70985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204684.71018: stderr chunk (state=3): >>><<< 49116 1727204684.71026: stdout chunk (state=3): >>><<< 49116 1727204684.71032: done transferring module to remote 49116 1727204684.71051: _low_level_execute_command(): starting 49116 1727204684.71054: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/ /root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/AnsiballZ_command.py && sleep 0' 49116 1727204684.72880: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204684.73073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204684.73078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204684.75192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204684.75405: stderr chunk (state=3): >>><<< 49116 1727204684.75511: stdout chunk (state=3): >>><<< 49116 1727204684.75695: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204684.75704: _low_level_execute_command(): starting 49116 1727204684.75707: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/AnsiballZ_command.py && sleep 0' 49116 1727204684.78078: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204684.78082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204684.78085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204684.78087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204684.78188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204684.78192: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204684.78198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204684.78371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204684.78403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204684.78408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204684.78777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204684.96572: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:44.960672", "end": "2024-09-24 15:04:44.964449", "delta": "0:00:00.003777", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204684.98412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204684.98418: stdout chunk (state=3): >>><<< 49116 1727204684.98512: stderr chunk (state=3): >>><<< 49116 1727204684.98603: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:04:44.960672", "end": "2024-09-24 15:04:44.964449", "delta": "0:00:00.003777", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
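
The module result above (cmd ["ls", "-1"] with chdir /sys/class/net, returning bonding_masters, eth0 and lo) is produced by the "Gather current interface info" task from get_current_interfaces.yml. The playbook source is not included in this log, so the following is only a minimal sketch reconstructed from the module_args and from the _current_interfaces variable referenced later in the run; the exact task in the collection may be worded differently.

    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # assumed register name, based on the variable seen later in the log
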
49116 1727204684.98796: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204684.98806: _low_level_execute_command(): starting 49116 1727204684.98812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204684.5801485-49702-97775753935429/ > /dev/null 2>&1 && sleep 0' 49116 1727204685.00421: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204685.00480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204685.00660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204685.02818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204685.03223: stderr chunk (state=3): >>><<< 49116 1727204685.03227: stdout chunk (state=3): >>><<< 49116 1727204685.03231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204685.03238: handler run complete 49116 1727204685.03241: Evaluated conditional (False): False 49116 1727204685.03243: attempt loop complete, returning result 49116 1727204685.03245: _execute() done 49116 1727204685.03247: dumping result to json 49116 1727204685.03249: done dumping result, returning 49116 1727204685.03263: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [127b8e07-fff9-02f7-957b-00000000034b] 49116 1727204685.03276: sending task result for task 127b8e07-fff9-02f7-957b-00000000034b ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003777", "end": "2024-09-24 15:04:44.964449", "rc": 0, "start": "2024-09-24 15:04:44.960672" } STDOUT: bonding_masters eth0 lo 49116 1727204685.03661: no more pending results, returning what we have 49116 1727204685.03667: results queue empty 49116 1727204685.03668: checking for any_errors_fatal 49116 1727204685.03670: done checking for any_errors_fatal 49116 1727204685.03671: checking for max_fail_percentage 49116 1727204685.03673: done checking for max_fail_percentage 49116 1727204685.03674: checking to see if all hosts have failed and the running result is not ok 49116 1727204685.03675: done checking to see if all hosts have failed 49116 1727204685.03676: getting the remaining hosts for this loop 49116 1727204685.03677: done getting the remaining hosts for this loop 49116 1727204685.03682: getting the next task for host managed-node3 49116 1727204685.03691: done getting next task for host managed-node3 49116 1727204685.03694: ^ task is: TASK: Set current_interfaces 49116 1727204685.03700: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204685.03705: getting variables 49116 1727204685.03707: in VariableManager get_vars() 49116 1727204685.04015: Calling all_inventory to load vars for managed-node3 49116 1727204685.04018: Calling groups_inventory to load vars for managed-node3 49116 1727204685.04021: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204685.04031: done sending task result for task 127b8e07-fff9-02f7-957b-00000000034b 49116 1727204685.04035: WORKER PROCESS EXITING 49116 1727204685.04050: Calling all_plugins_play to load vars for managed-node3 49116 1727204685.04054: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204685.04058: Calling groups_plugins_play to load vars for managed-node3 49116 1727204685.04609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204685.05051: done with get_vars() 49116 1727204685.05271: done getting variables 49116 1727204685.05349: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:04:45 -0400 (0:00:00.581) 0:00:08.078 ***** 49116 1727204685.05394: entering _queue_task() for managed-node3/set_fact 49116 1727204685.06176: worker is 1 (out of 1 available) 49116 1727204685.06191: exiting _queue_task() for managed-node3/set_fact 49116 1727204685.06208: done queuing things up, now waiting for results queue to drain 49116 1727204685.06210: waiting for pending results... 
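
The set_fact task queued here converts the registered command output into the current_interfaces fact used by the rest of the play. A minimal sketch, assuming the fact is built from stdout_lines of the registered _current_interfaces result (the exact expression in get_current_interfaces.yml is not shown in this log):

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed expression; the resulting fact matches the list reported below
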
49116 1727204685.06907: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 49116 1727204685.07039: in run() - task 127b8e07-fff9-02f7-957b-00000000034c 49116 1727204685.07146: variable 'ansible_search_path' from source: unknown 49116 1727204685.07156: variable 'ansible_search_path' from source: unknown 49116 1727204685.07264: calling self._execute() 49116 1727204685.07485: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.07498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.07514: variable 'omit' from source: magic vars 49116 1727204685.08558: variable 'ansible_distribution_major_version' from source: facts 49116 1727204685.08585: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204685.08683: variable 'omit' from source: magic vars 49116 1727204685.09085: variable 'omit' from source: magic vars 49116 1727204685.09268: variable '_current_interfaces' from source: set_fact 49116 1727204685.09573: variable 'omit' from source: magic vars 49116 1727204685.09577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204685.09622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204685.09650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204685.09704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204685.09871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204685.09875: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204685.09877: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.09879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.10063: Set connection var ansible_connection to ssh 49116 1727204685.10135: Set connection var ansible_timeout to 10 49116 1727204685.10240: Set connection var ansible_shell_executable to /bin/sh 49116 1727204685.10252: Set connection var ansible_pipelining to False 49116 1727204685.10259: Set connection var ansible_shell_type to sh 49116 1727204685.10270: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204685.10302: variable 'ansible_shell_executable' from source: unknown 49116 1727204685.10343: variable 'ansible_connection' from source: unknown 49116 1727204685.10352: variable 'ansible_module_compression' from source: unknown 49116 1727204685.10360: variable 'ansible_shell_type' from source: unknown 49116 1727204685.10554: variable 'ansible_shell_executable' from source: unknown 49116 1727204685.10559: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.10561: variable 'ansible_pipelining' from source: unknown 49116 1727204685.10564: variable 'ansible_timeout' from source: unknown 49116 1727204685.10568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.10733: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 49116 1727204685.11100: variable 'omit' from source: magic vars 49116 1727204685.11104: starting attempt loop 49116 1727204685.11107: running the handler 49116 1727204685.11109: handler run complete 49116 1727204685.11111: attempt loop complete, returning result 49116 1727204685.11113: _execute() done 49116 1727204685.11116: dumping result to json 49116 1727204685.11118: done dumping result, returning 49116 1727204685.11121: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [127b8e07-fff9-02f7-957b-00000000034c] 49116 1727204685.11123: sending task result for task 127b8e07-fff9-02f7-957b-00000000034c 49116 1727204685.11200: done sending task result for task 127b8e07-fff9-02f7-957b-00000000034c ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 49116 1727204685.11289: no more pending results, returning what we have 49116 1727204685.11292: results queue empty 49116 1727204685.11294: checking for any_errors_fatal 49116 1727204685.11302: done checking for any_errors_fatal 49116 1727204685.11303: checking for max_fail_percentage 49116 1727204685.11305: done checking for max_fail_percentage 49116 1727204685.11306: checking to see if all hosts have failed and the running result is not ok 49116 1727204685.11307: done checking to see if all hosts have failed 49116 1727204685.11308: getting the remaining hosts for this loop 49116 1727204685.11309: done getting the remaining hosts for this loop 49116 1727204685.11314: getting the next task for host managed-node3 49116 1727204685.11324: done getting next task for host managed-node3 49116 1727204685.11327: ^ task is: TASK: Show current_interfaces 49116 1727204685.11332: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204685.11336: getting variables 49116 1727204685.11338: in VariableManager get_vars() 49116 1727204685.11386: Calling all_inventory to load vars for managed-node3 49116 1727204685.11390: Calling groups_inventory to load vars for managed-node3 49116 1727204685.11392: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204685.11406: Calling all_plugins_play to load vars for managed-node3 49116 1727204685.11409: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204685.11413: Calling groups_plugins_play to load vars for managed-node3 49116 1727204685.12171: WORKER PROCESS EXITING 49116 1727204685.12394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204685.12677: done with get_vars() 49116 1727204685.12690: done getting variables 49116 1727204685.12762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:04:45 -0400 (0:00:00.074) 0:00:08.153 ***** 49116 1727204685.12878: entering _queue_task() for managed-node3/debug 49116 1727204685.13475: worker is 1 (out of 1 available) 49116 1727204685.13491: exiting _queue_task() for managed-node3/debug 49116 1727204685.13506: done queuing things up, now waiting for results queue to drain 49116 1727204685.13507: waiting for pending results... 
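
The debug task queued from show_interfaces.yml prints the fact that was just set. A minimal sketch consistent with the MSG reported below (current_interfaces: ['bonding_masters', 'eth0', 'lo']); the exact msg template is an assumption:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"
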
49116 1727204685.13876: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 49116 1727204685.13994: in run() - task 127b8e07-fff9-02f7-957b-000000000315 49116 1727204685.14172: variable 'ansible_search_path' from source: unknown 49116 1727204685.14176: variable 'ansible_search_path' from source: unknown 49116 1727204685.14179: calling self._execute() 49116 1727204685.14338: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.14580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.14771: variable 'omit' from source: magic vars 49116 1727204685.15301: variable 'ansible_distribution_major_version' from source: facts 49116 1727204685.15458: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204685.15528: variable 'omit' from source: magic vars 49116 1727204685.15697: variable 'omit' from source: magic vars 49116 1727204685.16125: variable 'current_interfaces' from source: set_fact 49116 1727204685.16371: variable 'omit' from source: magic vars 49116 1727204685.16375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204685.16378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204685.16381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204685.16383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204685.16385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204685.16388: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204685.16391: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.16393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.16764: Set connection var ansible_connection to ssh 49116 1727204685.16789: Set connection var ansible_timeout to 10 49116 1727204685.16803: Set connection var ansible_shell_executable to /bin/sh 49116 1727204685.16813: Set connection var ansible_pipelining to False 49116 1727204685.16821: Set connection var ansible_shell_type to sh 49116 1727204685.16836: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204685.17052: variable 'ansible_shell_executable' from source: unknown 49116 1727204685.17056: variable 'ansible_connection' from source: unknown 49116 1727204685.17059: variable 'ansible_module_compression' from source: unknown 49116 1727204685.17061: variable 'ansible_shell_type' from source: unknown 49116 1727204685.17063: variable 'ansible_shell_executable' from source: unknown 49116 1727204685.17067: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.17070: variable 'ansible_pipelining' from source: unknown 49116 1727204685.17072: variable 'ansible_timeout' from source: unknown 49116 1727204685.17075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.17374: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
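
Every task in this run repeats the same connection setup: ansible_connection=ssh, ansible_timeout=10, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_shell_type=sh and ansible_module_compression=ZIP_DEFLATED, with ansible_host and ansible_ssh_extra_args taken from host vars for managed-node3. As a rough illustration only, host vars of that kind can be carried in a YAML inventory like the sketch below; apart from the 10.31.45.169 address visible in the SSH output, the values are placeholders, not the real inventory contents.

    all:
      hosts:
        managed-node3:
          ansible_host: 10.31.45.169
          # ansible_ssh_extra_args is also set via host vars; its real value is not shown in this log
          ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder
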
49116 1727204685.17410: variable 'omit' from source: magic vars 49116 1727204685.17417: starting attempt loop 49116 1727204685.17420: running the handler 49116 1727204685.17473: handler run complete 49116 1727204685.17501: attempt loop complete, returning result 49116 1727204685.17505: _execute() done 49116 1727204685.17508: dumping result to json 49116 1727204685.17510: done dumping result, returning 49116 1727204685.17521: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [127b8e07-fff9-02f7-957b-000000000315] 49116 1727204685.17526: sending task result for task 127b8e07-fff9-02f7-957b-000000000315 49116 1727204685.17727: done sending task result for task 127b8e07-fff9-02f7-957b-000000000315 49116 1727204685.17733: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 49116 1727204685.17792: no more pending results, returning what we have 49116 1727204685.17795: results queue empty 49116 1727204685.17796: checking for any_errors_fatal 49116 1727204685.17803: done checking for any_errors_fatal 49116 1727204685.17804: checking for max_fail_percentage 49116 1727204685.17806: done checking for max_fail_percentage 49116 1727204685.17807: checking to see if all hosts have failed and the running result is not ok 49116 1727204685.17808: done checking to see if all hosts have failed 49116 1727204685.17809: getting the remaining hosts for this loop 49116 1727204685.17810: done getting the remaining hosts for this loop 49116 1727204685.17815: getting the next task for host managed-node3 49116 1727204685.17824: done getting next task for host managed-node3 49116 1727204685.17828: ^ task is: TASK: Install iproute 49116 1727204685.17831: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204685.17835: getting variables 49116 1727204685.17837: in VariableManager get_vars() 49116 1727204685.17887: Calling all_inventory to load vars for managed-node3 49116 1727204685.17929: Calling groups_inventory to load vars for managed-node3 49116 1727204685.17932: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204685.17945: Calling all_plugins_play to load vars for managed-node3 49116 1727204685.17949: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204685.17952: Calling groups_plugins_play to load vars for managed-node3 49116 1727204685.18328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204685.18562: done with get_vars() 49116 1727204685.18577: done getting variables 49116 1727204685.18640: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:04:45 -0400 (0:00:00.057) 0:00:08.211 ***** 49116 1727204685.18678: entering _queue_task() for managed-node3/package 49116 1727204685.19053: worker is 1 (out of 1 available) 49116 1727204685.19071: exiting _queue_task() for managed-node3/package 49116 1727204685.19087: done queuing things up, now waiting for results queue to drain 49116 1727204685.19089: waiting for pending results... 
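
The Install iproute task from manage_test_interface.yml goes through the generic package action, which on this host resolves to the dnf module (the AnsiballZ_dnf.py transfer that follows). A minimal sketch of the task, matching the name ["iproute"] and state "present" arguments visible in the module invocation later in the run; use of package rather than dnf directly is inferred from the "Loading ActionModule 'package'" lines:

    - name: Install iproute
      package:
        name: iproute
        state: present
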
49116 1727204685.19650: running TaskExecutor() for managed-node3/TASK: Install iproute 49116 1727204685.19980: in run() - task 127b8e07-fff9-02f7-957b-00000000021e 49116 1727204685.19984: variable 'ansible_search_path' from source: unknown 49116 1727204685.19986: variable 'ansible_search_path' from source: unknown 49116 1727204685.19989: calling self._execute() 49116 1727204685.19992: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.19994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.19997: variable 'omit' from source: magic vars 49116 1727204685.20435: variable 'ansible_distribution_major_version' from source: facts 49116 1727204685.20455: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204685.20529: variable 'omit' from source: magic vars 49116 1727204685.20581: variable 'omit' from source: magic vars 49116 1727204685.21030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204685.23956: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204685.24052: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204685.24223: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204685.24268: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204685.24311: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204685.24603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204685.24699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204685.24730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204685.24820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204685.24904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204685.25173: variable '__network_is_ostree' from source: set_fact 49116 1727204685.25176: variable 'omit' from source: magic vars 49116 1727204685.25295: variable 'omit' from source: magic vars 49116 1727204685.25334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204685.25375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204685.25496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204685.25526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 49116 1727204685.25543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204685.25610: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204685.25836: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.25839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.26055: Set connection var ansible_connection to ssh 49116 1727204685.26058: Set connection var ansible_timeout to 10 49116 1727204685.26060: Set connection var ansible_shell_executable to /bin/sh 49116 1727204685.26063: Set connection var ansible_pipelining to False 49116 1727204685.26067: Set connection var ansible_shell_type to sh 49116 1727204685.26069: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204685.26173: variable 'ansible_shell_executable' from source: unknown 49116 1727204685.26272: variable 'ansible_connection' from source: unknown 49116 1727204685.26276: variable 'ansible_module_compression' from source: unknown 49116 1727204685.26279: variable 'ansible_shell_type' from source: unknown 49116 1727204685.26281: variable 'ansible_shell_executable' from source: unknown 49116 1727204685.26283: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204685.26286: variable 'ansible_pipelining' from source: unknown 49116 1727204685.26288: variable 'ansible_timeout' from source: unknown 49116 1727204685.26290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204685.26390: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204685.26407: variable 'omit' from source: magic vars 49116 1727204685.26418: starting attempt loop 49116 1727204685.26424: running the handler 49116 1727204685.26434: variable 'ansible_facts' from source: unknown 49116 1727204685.26440: variable 'ansible_facts' from source: unknown 49116 1727204685.26598: _low_level_execute_command(): starting 49116 1727204685.26601: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204685.27490: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204685.27552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 
1727204685.27581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204685.27688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204685.28127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204685.30082: stdout chunk (state=3): >>>/root <<< 49116 1727204685.30247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204685.30251: stdout chunk (state=3): >>><<< 49116 1727204685.30273: stderr chunk (state=3): >>><<< 49116 1727204685.30369: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204685.30381: _low_level_execute_command(): starting 49116 1727204685.30388: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368 `" && echo ansible-tmp-1727204685.3036752-49834-245224683042368="` echo /root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368 `" ) && sleep 0' 49116 1727204685.32133: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204685.32382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204685.32562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204685.32619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204685.32650: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204685.32738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204685.34934: stdout chunk (state=3): >>>ansible-tmp-1727204685.3036752-49834-245224683042368=/root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368 <<< 49116 1727204685.35274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204685.35297: stderr chunk (state=3): >>><<< 49116 1727204685.35306: stdout chunk (state=3): >>><<< 49116 1727204685.35343: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204685.3036752-49834-245224683042368=/root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204685.35411: variable 'ansible_module_compression' from source: unknown 49116 1727204685.35645: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 49116 1727204685.35753: ANSIBALLZ: Acquiring lock 49116 1727204685.35756: ANSIBALLZ: Lock acquired: 139720119767104 49116 1727204685.35759: ANSIBALLZ: Creating module 49116 1727204685.65692: ANSIBALLZ: Writing module into payload 49116 1727204685.66146: ANSIBALLZ: Writing module 49116 1727204685.66178: ANSIBALLZ: Renaming module 49116 1727204685.66308: ANSIBALLZ: Done creating module 49116 1727204685.66331: variable 'ansible_facts' from source: unknown 49116 1727204685.66505: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/AnsiballZ_dnf.py 49116 1727204685.67304: Sending initial data 49116 1727204685.67308: Sent initial data (152 bytes) 49116 1727204685.68745: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204685.68750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204685.68928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204685.68938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204685.69084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204685.69141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204685.70928: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204685.70987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204685.71092: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpnqpxovzc /root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/AnsiballZ_dnf.py <<< 49116 1727204685.71096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/AnsiballZ_dnf.py" <<< 49116 1727204685.71184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpnqpxovzc" to remote "/root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/AnsiballZ_dnf.py" <<< 49116 1727204685.71193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/AnsiballZ_dnf.py" <<< 49116 1727204685.75023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204685.75168: stderr chunk (state=3): >>><<< 49116 1727204685.75172: stdout chunk (state=3): >>><<< 49116 1727204685.75188: done transferring module to remote 49116 1727204685.75202: _low_level_execute_command(): starting 49116 1727204685.75211: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/ /root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/AnsiballZ_dnf.py && sleep 0' 49116 1727204685.76831: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204685.77035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204685.77137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204685.77158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204685.79225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204685.79625: stderr chunk (state=3): >>><<< 49116 1727204685.79629: stdout chunk (state=3): >>><<< 49116 1727204685.79635: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204685.79637: _low_level_execute_command(): starting 49116 1727204685.79640: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/AnsiballZ_dnf.py && sleep 0' 49116 1727204685.81064: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204685.81102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204685.81181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204685.81311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204685.81387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204685.81406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204685.81527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204685.81643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.04294: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 49116 1727204687.09883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204687.09917: stdout chunk (state=3): >>><<< 49116 1727204687.09921: stderr chunk (state=3): >>><<< 49116 1727204687.09944: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 49116 1727204687.10075: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204687.10085: _low_level_execute_command(): starting 49116 1727204687.10092: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204685.3036752-49834-245224683042368/ > /dev/null 2>&1 && sleep 0' 49116 1727204687.10707: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204687.10725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204687.10741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204687.10759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204687.10783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204687.10888: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204687.10902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.10954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.11043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.13296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.13309: stdout chunk (state=3): >>><<< 49116 1727204687.13331: stderr chunk (state=3): >>><<< 49116 1727204687.13358: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204687.13375: handler run complete 49116 1727204687.13595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204687.13824: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204687.13887: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204687.13928: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204687.13964: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204687.14071: variable '__install_status' from source: unknown 49116 1727204687.14113: Evaluated conditional (__install_status is success): True 49116 1727204687.14170: attempt loop complete, returning result 49116 1727204687.14173: _execute() done 49116 1727204687.14175: dumping result to json 49116 1727204687.14180: done dumping result, returning 49116 1727204687.14182: done running TaskExecutor() for managed-node3/TASK: Install iproute [127b8e07-fff9-02f7-957b-00000000021e] 49116 1727204687.14184: sending task result for task 127b8e07-fff9-02f7-957b-00000000021e 49116 1727204687.14519: done sending task result for task 127b8e07-fff9-02f7-957b-00000000021e 49116 1727204687.14522: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 49116 1727204687.14628: no more pending results, returning what we have 49116 1727204687.14632: results queue empty 49116 1727204687.14633: checking for any_errors_fatal 49116 1727204687.14639: done checking for any_errors_fatal 49116 1727204687.14639: checking for max_fail_percentage 49116 1727204687.14642: done checking for max_fail_percentage 49116 1727204687.14643: checking to see if all hosts have failed and the running result is not ok 49116 1727204687.14644: done checking to see if all hosts have failed 49116 1727204687.14644: getting the remaining hosts for this loop 49116 1727204687.14646: done getting the remaining hosts for this loop 49116 1727204687.14651: getting the next task for host managed-node3 49116 1727204687.14658: done getting next task for host managed-node3 49116 1727204687.14662: ^ task is: TASK: Create veth interface {{ interface }} 49116 1727204687.14666: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204687.14671: getting variables 49116 1727204687.14673: in VariableManager get_vars() 49116 1727204687.14838: Calling all_inventory to load vars for managed-node3 49116 1727204687.14842: Calling groups_inventory to load vars for managed-node3 49116 1727204687.14844: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204687.14857: Calling all_plugins_play to load vars for managed-node3 49116 1727204687.14860: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204687.14863: Calling groups_plugins_play to load vars for managed-node3 49116 1727204687.15367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204687.15637: done with get_vars() 49116 1727204687.15649: done getting variables 49116 1727204687.15720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204687.15877: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:04:47 -0400 (0:00:01.972) 0:00:10.184 ***** 49116 1727204687.15934: entering _queue_task() for managed-node3/command 49116 1727204687.16691: worker is 1 (out of 1 available) 49116 1727204687.16705: exiting _queue_task() for managed-node3/command 49116 1727204687.16718: done queuing things up, now waiting for results queue to drain 49116 1727204687.16719: waiting for pending results... 
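For reference, the "Install iproute" result reported above ("Nothing to do", changed=false, attempts=1) corresponds to a dnf task of roughly the following shape. This is only a sketch reconstructed from the module_args and the "__install_status is success" conditional visible in the log; the retries and delay values are assumptions, not values taken from the actual test file.

    # Sketch reconstructed from the log above; retries/delay are assumed, not from the test file.
    - name: Install iproute
      ansible.builtin.dnf:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 6
      delay: 10

The surrounding SSH debug chatter also shows why each of these short-lived remote commands is cheap: the repeated "auto-mux: Trying existing master at '/root/.ansible/cp/1846617821'" lines mean the ssh connection plugin is reusing one persistent ControlMaster socket for every echo, mkdir, sftp put, chmod, and python invocation rather than negotiating a new session each time. A sketch of equivalent settings in a YAML inventory follows; the values shown are the plugin's usual defaults, not values read from this run's inventory.

    # Sketch: typical connection-multiplexing settings for the ssh connection plugin.
    # Values mirror the behaviour implied by the auto-mux lines; they are defaults, not taken from this inventory.
    all:
      vars:
        ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"
        ansible_control_path_dir: "~/.ansible/cp"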
49116 1727204687.17300: running TaskExecutor() for managed-node3/TASK: Create veth interface lsr101 49116 1727204687.17428: in run() - task 127b8e07-fff9-02f7-957b-00000000021f 49116 1727204687.17455: variable 'ansible_search_path' from source: unknown 49116 1727204687.17464: variable 'ansible_search_path' from source: unknown 49116 1727204687.18079: variable 'interface' from source: play vars 49116 1727204687.18147: variable 'interface' from source: play vars 49116 1727204687.18379: variable 'interface' from source: play vars 49116 1727204687.18840: Loaded config def from plugin (lookup/items) 49116 1727204687.18847: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 49116 1727204687.18876: variable 'omit' from source: magic vars 49116 1727204687.19232: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204687.19259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204687.19310: variable 'omit' from source: magic vars 49116 1727204687.19627: variable 'ansible_distribution_major_version' from source: facts 49116 1727204687.19648: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204687.19913: variable 'type' from source: play vars 49116 1727204687.19922: variable 'state' from source: include params 49116 1727204687.19942: variable 'interface' from source: play vars 49116 1727204687.20048: variable 'current_interfaces' from source: set_fact 49116 1727204687.20052: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 49116 1727204687.20055: variable 'omit' from source: magic vars 49116 1727204687.20058: variable 'omit' from source: magic vars 49116 1727204687.20091: variable 'item' from source: unknown 49116 1727204687.20193: variable 'item' from source: unknown 49116 1727204687.20215: variable 'omit' from source: magic vars 49116 1727204687.20269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204687.20314: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204687.20342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204687.20370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204687.20390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204687.20433: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204687.20442: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204687.20452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204687.20576: Set connection var ansible_connection to ssh 49116 1727204687.20615: Set connection var ansible_timeout to 10 49116 1727204687.20618: Set connection var ansible_shell_executable to /bin/sh 49116 1727204687.20626: Set connection var ansible_pipelining to False 49116 1727204687.20701: Set connection var ansible_shell_type to sh 49116 1727204687.20704: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204687.20707: variable 'ansible_shell_executable' from source: unknown 49116 1727204687.20709: variable 'ansible_connection' from source: unknown 49116 1727204687.20711: 
variable 'ansible_module_compression' from source: unknown 49116 1727204687.20713: variable 'ansible_shell_type' from source: unknown 49116 1727204687.20716: variable 'ansible_shell_executable' from source: unknown 49116 1727204687.20722: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204687.20724: variable 'ansible_pipelining' from source: unknown 49116 1727204687.20726: variable 'ansible_timeout' from source: unknown 49116 1727204687.20728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204687.20888: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204687.20907: variable 'omit' from source: magic vars 49116 1727204687.20923: starting attempt loop 49116 1727204687.20930: running the handler 49116 1727204687.20955: _low_level_execute_command(): starting 49116 1727204687.20970: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204687.21934: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.21976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204687.22034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.22037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.22152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.24174: stdout chunk (state=3): >>>/root <<< 49116 1727204687.24312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.24351: stdout chunk (state=3): >>><<< 49116 1727204687.24355: stderr chunk (state=3): >>><<< 49116 1727204687.24543: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204687.24548: _low_level_execute_command(): starting 49116 1727204687.24551: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659 `" && echo ansible-tmp-1727204687.243801-49950-234676398376659="` echo /root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659 `" ) && sleep 0' 49116 1727204687.25833: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204687.25848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204687.25859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204687.25879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204687.25892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204687.26090: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.26094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204687.26097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.26197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.26298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.28551: stdout chunk (state=3): >>>ansible-tmp-1727204687.243801-49950-234676398376659=/root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659 <<< 49116 1727204687.29789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.29794: stdout chunk (state=3): >>><<< 49116 1727204687.29797: stderr chunk (state=3): >>><<< 49116 1727204687.29799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204687.243801-49950-234676398376659=/root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204687.29802: variable 'ansible_module_compression' from source: unknown 49116 1727204687.29804: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204687.29807: variable 'ansible_facts' from source: unknown 49116 1727204687.30538: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/AnsiballZ_command.py 49116 1727204687.31121: Sending initial data 49116 1727204687.31125: Sent initial data (155 bytes) 49116 1727204687.32286: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204687.32298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.32592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.32789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.32821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.34707: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 
debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204687.34805: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204687.34959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp84og2m8e /root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/AnsiballZ_command.py <<< 49116 1727204687.35011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp84og2m8e" to remote "/root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/AnsiballZ_command.py" <<< 49116 1727204687.37295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.37577: stderr chunk (state=3): >>><<< 49116 1727204687.37586: stdout chunk (state=3): >>><<< 49116 1727204687.37589: done transferring module to remote 49116 1727204687.37591: _low_level_execute_command(): starting 49116 1727204687.37594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/ /root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/AnsiballZ_command.py && sleep 0' 49116 1727204687.38893: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204687.38930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204687.38934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204687.38937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204687.38945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204687.38948: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204687.38959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.39026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204687.39371: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.39402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.39772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.42181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.42186: stdout chunk (state=3): >>><<< 49116 1727204687.42191: stderr chunk (state=3): >>><<< 49116 1727204687.42211: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204687.42214: _low_level_execute_command(): starting 49116 1727204687.42220: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/AnsiballZ_command.py && sleep 0' 49116 1727204687.43561: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204687.43568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.43661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204687.43671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.43684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.43890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.43894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.62481: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-24 15:04:47.614873", "end": "2024-09-24 15:04:47.621537", "delta": "0:00:00.006664", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204687.65597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204687.65697: stderr chunk (state=3): >>><<< 49116 1727204687.65707: stdout chunk (state=3): >>><<< 49116 1727204687.65734: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-24 15:04:47.614873", "end": "2024-09-24 15:04:47.621537", "delta": "0:00:00.006664", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204687.65786: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr101 type veth peer name peerlsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204687.65803: _low_level_execute_command(): starting 49116 1727204687.66053: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204687.243801-49950-234676398376659/ > /dev/null 2>&1 && sleep 0' 49116 1727204687.67493: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204687.67792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.67930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204687.68038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.68207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.68328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.73678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.73682: stdout chunk (state=3): >>><<< 49116 1727204687.73685: stderr chunk (state=3): >>><<< 49116 1727204687.73712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204687.73783: handler run complete 49116 1727204687.73811: Evaluated conditional (False): False 49116 1727204687.73840: attempt loop complete, returning result 49116 1727204687.74075: variable 'item' from source: unknown 49116 1727204687.74094: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101" ], "delta": "0:00:00.006664", "end": "2024-09-24 15:04:47.621537", "item": "ip link add lsr101 type veth peer name peerlsr101", "rc": 0, "start": "2024-09-24 15:04:47.614873" } 49116 1727204687.74454: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204687.74458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204687.74461: variable 'omit' from source: magic vars 49116 1727204687.74627: variable 'ansible_distribution_major_version' from source: facts 49116 1727204687.74633: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204687.75036: variable 'type' from source: play vars 49116 1727204687.75064: variable 'state' from source: include params 49116 1727204687.75067: variable 'interface' from source: play vars 49116 1727204687.75071: variable 'current_interfaces' from source: set_fact 49116 1727204687.75073: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 49116 1727204687.75075: variable 'omit' from source: magic vars 49116 1727204687.75221: variable 'omit' from source: magic vars 49116 1727204687.75229: variable 'item' from source: unknown 49116 1727204687.75304: variable 'item' from source: unknown 49116 1727204687.75319: variable 'omit' from source: magic vars 49116 1727204687.75370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204687.75373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204687.75375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204687.75581: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204687.75584: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204687.75587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204687.75681: Set connection var ansible_connection to ssh 49116 1727204687.75694: Set connection var ansible_timeout to 10 49116 1727204687.75703: Set connection var ansible_shell_executable to /bin/sh 49116 1727204687.75707: Set connection var ansible_pipelining to False 49116 1727204687.75710: Set connection var ansible_shell_type to sh 49116 1727204687.75768: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 
1727204687.75771: variable 'ansible_shell_executable' from source: unknown 49116 1727204687.75773: variable 'ansible_connection' from source: unknown 49116 1727204687.75775: variable 'ansible_module_compression' from source: unknown 49116 1727204687.75777: variable 'ansible_shell_type' from source: unknown 49116 1727204687.75779: variable 'ansible_shell_executable' from source: unknown 49116 1727204687.75781: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204687.75783: variable 'ansible_pipelining' from source: unknown 49116 1727204687.75784: variable 'ansible_timeout' from source: unknown 49116 1727204687.75786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204687.76006: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204687.76051: variable 'omit' from source: magic vars 49116 1727204687.76055: starting attempt loop 49116 1727204687.76058: running the handler 49116 1727204687.76060: _low_level_execute_command(): starting 49116 1727204687.76062: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204687.77505: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204687.77535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204687.77554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204687.77782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204687.77805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204687.77882: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.77945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.77957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.78046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.79907: stdout chunk (state=3): >>>/root <<< 49116 1727204687.80275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.80279: stdout chunk (state=3): >>><<< 49116 1727204687.80281: stderr chunk (state=3): >>><<< 49116 1727204687.80284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204687.80286: _low_level_execute_command(): starting 49116 1727204687.80288: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495 `" && echo ansible-tmp-1727204687.8024473-49950-274819846531495="` echo /root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495 `" ) && sleep 0' 49116 1727204687.81445: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204687.81462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204687.81480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204687.81585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.81613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204687.81642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.81657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.81764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.83904: stdout chunk (state=3): >>>ansible-tmp-1727204687.8024473-49950-274819846531495=/root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495 <<< 49116 1727204687.84280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.84284: stdout chunk (state=3): >>><<< 49116 1727204687.84287: stderr chunk (state=3): >>><<< 49116 1727204687.84290: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204687.8024473-49950-274819846531495=/root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204687.84293: variable 'ansible_module_compression' from source: unknown 49116 1727204687.84295: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204687.84297: variable 'ansible_facts' from source: unknown 49116 1727204687.84327: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/AnsiballZ_command.py 49116 1727204687.84513: Sending initial data 49116 1727204687.84537: Sent initial data (156 bytes) 49116 1727204687.85268: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204687.85293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204687.85445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.85522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.87292: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204687.87361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204687.87434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpks0h85r6 /root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/AnsiballZ_command.py <<< 49116 1727204687.87437: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/AnsiballZ_command.py" <<< 49116 1727204687.87498: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpks0h85r6" to remote "/root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/AnsiballZ_command.py" <<< 49116 1727204687.87501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/AnsiballZ_command.py" <<< 49116 1727204687.88251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.88284: stderr chunk (state=3): >>><<< 49116 1727204687.88290: stdout chunk (state=3): >>><<< 49116 1727204687.88308: done transferring module to remote 49116 1727204687.88316: _low_level_execute_command(): starting 49116 1727204687.88321: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/ /root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/AnsiballZ_command.py && sleep 0' 49116 1727204687.89001: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204687.89029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.89140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204687.91150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204687.91218: stderr chunk (state=3): >>><<< 49116 1727204687.91222: stdout chunk 
(state=3): >>><<< 49116 1727204687.91238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204687.91241: _low_level_execute_command(): starting 49116 1727204687.91246: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/AnsiballZ_command.py && sleep 0' 49116 1727204687.91772: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204687.91776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204687.91778: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204687.91781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204687.91836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204687.91840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204687.91928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.10036: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-24 15:04:48.093646", "end": "2024-09-24 15:04:48.098326", "delta": "0:00:00.004680", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, 
"creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204688.11930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204688.12143: stderr chunk (state=3): >>><<< 49116 1727204688.12147: stdout chunk (state=3): >>><<< 49116 1727204688.12150: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-24 15:04:48.093646", "end": "2024-09-24 15:04:48.098326", "delta": "0:00:00.004680", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204688.12158: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204688.12161: _low_level_execute_command(): starting 49116 1727204688.12163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204687.8024473-49950-274819846531495/ > /dev/null 2>&1 && sleep 0' 49116 1727204688.13493: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.13599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.13687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.15893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.16173: stderr chunk (state=3): >>><<< 49116 1727204688.16177: stdout chunk (state=3): >>><<< 49116 1727204688.16180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204688.16182: handler run complete 49116 1727204688.16184: Evaluated conditional (False): False 49116 1727204688.16186: attempt loop complete, returning result 49116 1727204688.16208: variable 'item' from source: unknown 49116 1727204688.16573: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr101", "up" ], "delta": "0:00:00.004680", "end": "2024-09-24 15:04:48.098326", "item": "ip link set peerlsr101 up", "rc": 0, "start": "2024-09-24 15:04:48.093646" } 49116 1727204688.16976: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204688.16980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204688.16983: variable 'omit' from source: magic vars 49116 1727204688.17212: variable 'ansible_distribution_major_version' from source: facts 49116 1727204688.17224: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204688.17738: variable 'type' from source: play vars 49116 1727204688.17772: variable 'state' from source: include params 49116 1727204688.17776: variable 'interface' from source: play vars 49116 1727204688.17778: variable 'current_interfaces' from source: set_fact 49116 1727204688.17845: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 49116 1727204688.17849: variable 'omit' from source: magic vars 49116 1727204688.17851: variable 'omit' from source: magic vars 49116 1727204688.18173: variable 'item' from source: unknown 49116 1727204688.18176: variable 'item' from source: unknown 49116 1727204688.18199: variable 'omit' from source: magic vars 49116 1727204688.18230: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204688.18246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204688.18257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204688.18285: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204688.18293: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204688.18300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204688.18523: Set connection var ansible_connection to ssh 49116 1727204688.18542: Set connection var ansible_timeout to 10 49116 1727204688.18555: Set connection var ansible_shell_executable to /bin/sh 49116 1727204688.18569: Set connection var ansible_pipelining to False 49116 1727204688.18577: Set connection var ansible_shell_type to sh 49116 1727204688.18587: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204688.18620: variable 'ansible_shell_executable' from source: unknown 49116 1727204688.18677: variable 'ansible_connection' from source: unknown 49116 1727204688.18685: variable 'ansible_module_compression' from source: unknown 49116 1727204688.18693: variable 'ansible_shell_type' from source: 
unknown 49116 1727204688.18700: variable 'ansible_shell_executable' from source: unknown 49116 1727204688.18707: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204688.18720: variable 'ansible_pipelining' from source: unknown 49116 1727204688.18727: variable 'ansible_timeout' from source: unknown 49116 1727204688.18872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204688.19148: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204688.19152: variable 'omit' from source: magic vars 49116 1727204688.19154: starting attempt loop 49116 1727204688.19156: running the handler 49116 1727204688.19158: _low_level_execute_command(): starting 49116 1727204688.19167: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204688.20364: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204688.20577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.20811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.20879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.22690: stdout chunk (state=3): >>>/root <<< 49116 1727204688.22860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.23059: stderr chunk (state=3): >>><<< 49116 1727204688.23063: stdout chunk (state=3): >>><<< 49116 1727204688.23068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204688.23071: _low_level_execute_command(): starting 49116 1727204688.23074: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515 `" && echo ansible-tmp-1727204688.230069-49950-276887341951515="` echo /root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515 `" ) && sleep 0' 49116 1727204688.24598: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.24719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.24772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.24896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.27041: stdout chunk (state=3): >>>ansible-tmp-1727204688.230069-49950-276887341951515=/root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515 <<< 49116 1727204688.27523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.27527: stdout chunk (state=3): >>><<< 49116 1727204688.27529: stderr chunk (state=3): >>><<< 49116 1727204688.27532: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204688.230069-49950-276887341951515=/root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204688.27535: variable 'ansible_module_compression' from source: unknown 49116 1727204688.27537: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204688.27539: variable 'ansible_facts' from source: unknown 49116 1727204688.27768: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/AnsiballZ_command.py 49116 1727204688.28044: Sending initial data 49116 1727204688.28058: Sent initial data (155 bytes) 49116 1727204688.29659: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.29794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.29909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.31677: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204688.31776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204688.31864: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmps9n9z1lt /root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/AnsiballZ_command.py <<< 49116 1727204688.31870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/AnsiballZ_command.py" <<< 49116 1727204688.32159: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmps9n9z1lt" to remote "/root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/AnsiballZ_command.py" <<< 49116 1727204688.33328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.33406: stderr chunk (state=3): >>><<< 49116 1727204688.33450: stdout chunk (state=3): >>><<< 49116 1727204688.33477: done transferring module to remote 49116 1727204688.33486: _low_level_execute_command(): starting 49116 1727204688.33491: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/ /root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/AnsiballZ_command.py && sleep 0' 49116 1727204688.34874: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.34883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.35092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.35176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.37355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.37389: stderr chunk (state=3): >>><<< 49116 1727204688.37393: stdout chunk (state=3): >>><<< 49116 1727204688.37512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204688.37516: _low_level_execute_command(): starting 49116 1727204688.37519: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/AnsiballZ_command.py && sleep 0' 49116 1727204688.38992: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.39134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.39163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.39322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.57312: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-24 15:04:48.567014", "end": "2024-09-24 15:04:48.571179", "delta": "0:00:00.004165", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204688.59233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204688.59239: stdout chunk (state=3): >>><<< 49116 1727204688.59241: stderr chunk (state=3): >>><<< 49116 1727204688.59397: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-24 15:04:48.567014", "end": "2024-09-24 15:04:48.571179", "delta": "0:00:00.004165", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204688.59423: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204688.59446: _low_level_execute_command(): starting 49116 1727204688.59449: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204688.230069-49950-276887341951515/ > /dev/null 2>&1 && sleep 0' 49116 1727204688.60758: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204688.60763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204688.60768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204688.60771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204688.60773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204688.60775: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204688.60778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.60972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.61055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.61244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.63210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.63384: stderr chunk (state=3): >>><<< 49116 1727204688.63388: stdout chunk (state=3): >>><<< 49116 1727204688.63390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204688.63393: handler run complete 49116 1727204688.63395: Evaluated conditional (False): False 49116 1727204688.63397: attempt loop complete, returning result 49116 1727204688.63408: variable 'item' from source: unknown 49116 1727204688.63494: variable 'item' from source: unknown ok: [managed-node3] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr101", "up" ], "delta": "0:00:00.004165", "end": "2024-09-24 15:04:48.571179", "item": "ip link set lsr101 up", "rc": 0, "start": "2024-09-24 15:04:48.567014" } 49116 1727204688.63743: dumping result to json 49116 1727204688.63745: done dumping result, returning 49116 1727204688.63748: done running TaskExecutor() for managed-node3/TASK: Create veth interface lsr101 [127b8e07-fff9-02f7-957b-00000000021f] 49116 1727204688.63750: sending task result for task 127b8e07-fff9-02f7-957b-00000000021f 49116 1727204688.63910: no more pending results, returning what we have 49116 1727204688.63914: results queue empty 49116 1727204688.63915: checking for any_errors_fatal 49116 1727204688.63919: done checking for any_errors_fatal 49116 1727204688.63920: checking for max_fail_percentage 49116 1727204688.63921: done checking for max_fail_percentage 49116 1727204688.63922: checking to see if all hosts have failed and the running result is not ok 49116 1727204688.63923: done checking to see if all hosts have failed 49116 1727204688.63924: getting the remaining hosts for this loop 49116 1727204688.63925: done getting the remaining hosts for this loop 49116 1727204688.63929: getting the next task for host managed-node3 49116 1727204688.63934: done getting next task for host managed-node3 49116 1727204688.63937: ^ task is: TASK: Set up veth as managed by NetworkManager 49116 1727204688.63940: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204688.63962: getting variables 49116 1727204688.63963: in VariableManager get_vars() 49116 1727204688.64008: Calling all_inventory to load vars for managed-node3 49116 1727204688.64011: Calling groups_inventory to load vars for managed-node3 49116 1727204688.64014: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204688.64021: done sending task result for task 127b8e07-fff9-02f7-957b-00000000021f 49116 1727204688.64024: WORKER PROCESS EXITING 49116 1727204688.64038: Calling all_plugins_play to load vars for managed-node3 49116 1727204688.64041: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204688.64044: Calling groups_plugins_play to load vars for managed-node3 49116 1727204688.64280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204688.64501: done with get_vars() 49116 1727204688.64514: done getting variables 49116 1727204688.64575: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:04:48 -0400 (0:00:01.486) 0:00:11.671 ***** 49116 1727204688.64609: entering _queue_task() for managed-node3/command 49116 1727204688.64948: worker is 1 (out of 1 available) 49116 1727204688.64962: exiting _queue_task() for managed-node3/command 49116 1727204688.65178: done queuing things up, now waiting for results queue to drain 49116 1727204688.65180: waiting for pending results... 
49116 1727204688.65277: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 49116 1727204688.65413: in run() - task 127b8e07-fff9-02f7-957b-000000000220 49116 1727204688.65515: variable 'ansible_search_path' from source: unknown 49116 1727204688.65519: variable 'ansible_search_path' from source: unknown 49116 1727204688.65522: calling self._execute() 49116 1727204688.65596: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204688.65608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204688.65629: variable 'omit' from source: magic vars 49116 1727204688.66036: variable 'ansible_distribution_major_version' from source: facts 49116 1727204688.66059: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204688.66242: variable 'type' from source: play vars 49116 1727204688.66254: variable 'state' from source: include params 49116 1727204688.66263: Evaluated conditional (type == 'veth' and state == 'present'): True 49116 1727204688.66278: variable 'omit' from source: magic vars 49116 1727204688.66323: variable 'omit' from source: magic vars 49116 1727204688.66443: variable 'interface' from source: play vars 49116 1727204688.66491: variable 'omit' from source: magic vars 49116 1727204688.66521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204688.66573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204688.66603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204688.66707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204688.66711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204688.66713: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204688.66715: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204688.66717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204688.66806: Set connection var ansible_connection to ssh 49116 1727204688.66834: Set connection var ansible_timeout to 10 49116 1727204688.66849: Set connection var ansible_shell_executable to /bin/sh 49116 1727204688.66858: Set connection var ansible_pipelining to False 49116 1727204688.66867: Set connection var ansible_shell_type to sh 49116 1727204688.66877: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204688.66905: variable 'ansible_shell_executable' from source: unknown 49116 1727204688.66914: variable 'ansible_connection' from source: unknown 49116 1727204688.66924: variable 'ansible_module_compression' from source: unknown 49116 1727204688.66931: variable 'ansible_shell_type' from source: unknown 49116 1727204688.66940: variable 'ansible_shell_executable' from source: unknown 49116 1727204688.66947: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204688.66954: variable 'ansible_pipelining' from source: unknown 49116 1727204688.66960: variable 'ansible_timeout' from source: unknown 49116 1727204688.66970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204688.67141: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204688.67146: variable 'omit' from source: magic vars 49116 1727204688.67157: starting attempt loop 49116 1727204688.67249: running the handler 49116 1727204688.67252: _low_level_execute_command(): starting 49116 1727204688.67255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204688.68008: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204688.68035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204688.68051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204688.68074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204688.68091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204688.68139: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.68209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204688.68236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.68288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.68370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.70221: stdout chunk (state=3): >>>/root <<< 49116 1727204688.70437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.70442: stdout chunk (state=3): >>><<< 49116 1727204688.70444: stderr chunk (state=3): >>><<< 49116 1727204688.70605: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204688.70609: _low_level_execute_command(): starting 49116 1727204688.70613: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390 `" && echo ansible-tmp-1727204688.7049594-50035-209903920678390="` echo /root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390 `" ) && sleep 0' 49116 1727204688.71430: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204688.71522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204688.71592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.71624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.71642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.71785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.73935: stdout chunk (state=3): >>>ansible-tmp-1727204688.7049594-50035-209903920678390=/root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390 <<< 49116 1727204688.74291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.74297: stdout chunk (state=3): >>><<< 49116 1727204688.74300: stderr chunk (state=3): >>><<< 49116 1727204688.74304: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204688.7049594-50035-209903920678390=/root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204688.74307: variable 'ansible_module_compression' from source: unknown 49116 1727204688.74319: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204688.74362: variable 'ansible_facts' from source: unknown 49116 1727204688.74460: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/AnsiballZ_command.py 49116 1727204688.74736: Sending initial data 49116 1727204688.74740: Sent initial data (156 bytes) 49116 1727204688.75295: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204688.75300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.75324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204688.75328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.75395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204688.75399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.75478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.77273: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204688.77335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204688.77426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp2uvewml1 /root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/AnsiballZ_command.py <<< 49116 1727204688.77430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/AnsiballZ_command.py" <<< 49116 1727204688.77485: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp2uvewml1" to remote "/root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/AnsiballZ_command.py" <<< 49116 1727204688.82690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.82860: stderr chunk (state=3): >>><<< 49116 1727204688.82864: stdout chunk (state=3): >>><<< 49116 1727204688.82869: done transferring module to remote 49116 1727204688.82871: _low_level_execute_command(): starting 49116 1727204688.82873: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/ /root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/AnsiballZ_command.py && sleep 0' 49116 1727204688.83346: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204688.83350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.83353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204688.83360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.83405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204688.83408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.83493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204688.85590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204688.85594: stdout chunk (state=3): >>><<< 49116 1727204688.85597: stderr chunk (state=3): >>><<< 49116 1727204688.85710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204688.85714: _low_level_execute_command(): starting 49116 1727204688.85716: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/AnsiballZ_command.py && sleep 0' 49116 1727204688.86433: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204688.86438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204688.86442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204688.86445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204688.86461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204688.86545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204689.06173: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-24 15:04:49.040969", "end": "2024-09-24 15:04:49.060375", "delta": "0:00:00.019406", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204689.08203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204689.08251: stderr chunk (state=3): >>><<< 49116 1727204689.08263: stdout chunk (state=3): >>><<< 49116 1727204689.08304: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-24 15:04:49.040969", "end": "2024-09-24 15:04:49.060375", "delta": "0:00:00.019406", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
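[Editor's sketch] The module invocation and result above correspond to a plain command task. A minimal reconstruction from the logged module_args follows; the actual task in manage_test_interface.yml is not reproduced in this log, so the task layout and the comments are inferences, not the author's file:

    - name: Set up veth as managed by NetworkManager    # name as reported by the executor later in this excerpt
      ansible.builtin.command: nmcli d set lsr101 managed true
      # In the real tasks file the device name is presumably templated as {{ interface }};
      # lsr101 is its value in this run. The executor later reports this task as "ok" with
      # changed: false even though the module returned changed: true, which would be
      # consistent with the task overriding its changed status (e.g. changed_when: false),
      # but the task file itself is not shown here.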
49116 1727204689.08350: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr101 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204689.08406: _low_level_execute_command(): starting 49116 1727204689.08410: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204688.7049594-50035-209903920678390/ > /dev/null 2>&1 && sleep 0' 49116 1727204689.09101: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204689.09195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.09259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204689.09293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204689.09321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204689.09431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204689.11562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204689.11572: stdout chunk (state=3): >>><<< 49116 1727204689.11576: stderr chunk (state=3): >>><<< 49116 1727204689.11595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204689.11772: handler run complete 49116 1727204689.11776: Evaluated conditional (False): False 49116 1727204689.11778: attempt loop complete, returning result 49116 1727204689.11780: _execute() done 49116 1727204689.11781: dumping result to json 49116 1727204689.11783: done dumping result, returning 49116 1727204689.11785: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [127b8e07-fff9-02f7-957b-000000000220] 49116 1727204689.11787: sending task result for task 127b8e07-fff9-02f7-957b-000000000220 49116 1727204689.11858: done sending task result for task 127b8e07-fff9-02f7-957b-000000000220 49116 1727204689.11861: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr101", "managed", "true" ], "delta": "0:00:00.019406", "end": "2024-09-24 15:04:49.060375", "rc": 0, "start": "2024-09-24 15:04:49.040969" } 49116 1727204689.11931: no more pending results, returning what we have 49116 1727204689.11934: results queue empty 49116 1727204689.11936: checking for any_errors_fatal 49116 1727204689.11948: done checking for any_errors_fatal 49116 1727204689.11948: checking for max_fail_percentage 49116 1727204689.11950: done checking for max_fail_percentage 49116 1727204689.11951: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.11952: done checking to see if all hosts have failed 49116 1727204689.11953: getting the remaining hosts for this loop 49116 1727204689.11954: done getting the remaining hosts for this loop 49116 1727204689.11959: getting the next task for host managed-node3 49116 1727204689.11976: done getting next task for host managed-node3 49116 1727204689.11980: ^ task is: TASK: Delete veth interface {{ interface }} 49116 1727204689.11984: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.11988: getting variables 49116 1727204689.11990: in VariableManager get_vars() 49116 1727204689.12044: Calling all_inventory to load vars for managed-node3 49116 1727204689.12047: Calling groups_inventory to load vars for managed-node3 49116 1727204689.12049: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.12171: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.12177: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.12182: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.12686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.12968: done with get_vars() 49116 1727204689.12984: done getting variables 49116 1727204689.13066: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204689.13212: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.486) 0:00:12.157 ***** 49116 1727204689.13253: entering _queue_task() for managed-node3/command 49116 1727204689.13819: worker is 1 (out of 1 available) 49116 1727204689.13833: exiting _queue_task() for managed-node3/command 49116 1727204689.13845: done queuing things up, now waiting for results queue to drain 49116 1727204689.13846: waiting for pending results... 
49116 1727204689.14092: running TaskExecutor() for managed-node3/TASK: Delete veth interface lsr101 49116 1727204689.14189: in run() - task 127b8e07-fff9-02f7-957b-000000000221 49116 1727204689.14193: variable 'ansible_search_path' from source: unknown 49116 1727204689.14195: variable 'ansible_search_path' from source: unknown 49116 1727204689.14201: calling self._execute() 49116 1727204689.14313: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.14324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.14341: variable 'omit' from source: magic vars 49116 1727204689.14787: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.14808: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.15081: variable 'type' from source: play vars 49116 1727204689.15094: variable 'state' from source: include params 49116 1727204689.15173: variable 'interface' from source: play vars 49116 1727204689.15177: variable 'current_interfaces' from source: set_fact 49116 1727204689.15180: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 49116 1727204689.15183: when evaluation is False, skipping this task 49116 1727204689.15185: _execute() done 49116 1727204689.15188: dumping result to json 49116 1727204689.15190: done dumping result, returning 49116 1727204689.15193: done running TaskExecutor() for managed-node3/TASK: Delete veth interface lsr101 [127b8e07-fff9-02f7-957b-000000000221] 49116 1727204689.15195: sending task result for task 127b8e07-fff9-02f7-957b-000000000221 49116 1727204689.15499: done sending task result for task 127b8e07-fff9-02f7-957b-000000000221 49116 1727204689.15503: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204689.15551: no more pending results, returning what we have 49116 1727204689.15554: results queue empty 49116 1727204689.15556: checking for any_errors_fatal 49116 1727204689.15563: done checking for any_errors_fatal 49116 1727204689.15564: checking for max_fail_percentage 49116 1727204689.15567: done checking for max_fail_percentage 49116 1727204689.15568: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.15569: done checking to see if all hosts have failed 49116 1727204689.15570: getting the remaining hosts for this loop 49116 1727204689.15571: done getting the remaining hosts for this loop 49116 1727204689.15576: getting the next task for host managed-node3 49116 1727204689.15581: done getting next task for host managed-node3 49116 1727204689.15584: ^ task is: TASK: Create dummy interface {{ interface }} 49116 1727204689.15587: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.15591: getting variables 49116 1727204689.15592: in VariableManager get_vars() 49116 1727204689.15638: Calling all_inventory to load vars for managed-node3 49116 1727204689.15641: Calling groups_inventory to load vars for managed-node3 49116 1727204689.15643: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.15656: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.15659: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.15663: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.16054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.16316: done with get_vars() 49116 1727204689.16339: done getting variables 49116 1727204689.16412: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204689.16555: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.033) 0:00:12.190 ***** 49116 1727204689.16593: entering _queue_task() for managed-node3/command 49116 1727204689.17096: worker is 1 (out of 1 available) 49116 1727204689.17110: exiting _queue_task() for managed-node3/command 49116 1727204689.17121: done queuing things up, now waiting for results queue to drain 49116 1727204689.17123: waiting for pending results... 
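[Editor's sketch] The "Delete veth interface lsr101" task above, and the dummy/tap create/delete tasks that follow, are each guarded by a when: condition on the interface type, the desired state, and the current_interfaces fact; the false_condition strings in the skip results spell the conditions out verbatim. A minimal sketch of that pattern, reconstructed only from the logged conditions (the ip link commands are assumptions, not taken from this log):

    - name: Delete veth interface {{ interface }}
      ansible.builtin.command: ip link delete {{ interface }} type veth    # command body is an assumption
      when: type == 'veth' and state == 'absent' and interface in current_interfaces

    - name: Create dummy interface {{ interface }}
      ansible.builtin.command: ip link add {{ interface }} type dummy      # command body is an assumption
      when: type == 'dummy' and state == 'present' and interface not in current_interfaces

In this excerpt every one of these guarded tasks evaluates its condition to False and is skipped with skip_reason "Conditional result was False".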
49116 1727204689.17390: running TaskExecutor() for managed-node3/TASK: Create dummy interface lsr101 49116 1727204689.17533: in run() - task 127b8e07-fff9-02f7-957b-000000000222 49116 1727204689.17537: variable 'ansible_search_path' from source: unknown 49116 1727204689.17540: variable 'ansible_search_path' from source: unknown 49116 1727204689.17581: calling self._execute() 49116 1727204689.17698: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.17749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.17752: variable 'omit' from source: magic vars 49116 1727204689.18186: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.18212: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.18517: variable 'type' from source: play vars 49116 1727204689.18523: variable 'state' from source: include params 49116 1727204689.18526: variable 'interface' from source: play vars 49116 1727204689.18528: variable 'current_interfaces' from source: set_fact 49116 1727204689.18532: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 49116 1727204689.18534: when evaluation is False, skipping this task 49116 1727204689.18537: _execute() done 49116 1727204689.18539: dumping result to json 49116 1727204689.18541: done dumping result, returning 49116 1727204689.18550: done running TaskExecutor() for managed-node3/TASK: Create dummy interface lsr101 [127b8e07-fff9-02f7-957b-000000000222] 49116 1727204689.18560: sending task result for task 127b8e07-fff9-02f7-957b-000000000222 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204689.18856: no more pending results, returning what we have 49116 1727204689.18860: results queue empty 49116 1727204689.18861: checking for any_errors_fatal 49116 1727204689.18871: done checking for any_errors_fatal 49116 1727204689.18872: checking for max_fail_percentage 49116 1727204689.18874: done checking for max_fail_percentage 49116 1727204689.18876: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.18884: done checking to see if all hosts have failed 49116 1727204689.18885: getting the remaining hosts for this loop 49116 1727204689.18887: done getting the remaining hosts for this loop 49116 1727204689.18894: getting the next task for host managed-node3 49116 1727204689.18901: done getting next task for host managed-node3 49116 1727204689.18904: ^ task is: TASK: Delete dummy interface {{ interface }} 49116 1727204689.18908: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.18913: getting variables 49116 1727204689.18915: in VariableManager get_vars() 49116 1727204689.18964: Calling all_inventory to load vars for managed-node3 49116 1727204689.19073: Calling groups_inventory to load vars for managed-node3 49116 1727204689.19076: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.19107: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.19111: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.19117: done sending task result for task 127b8e07-fff9-02f7-957b-000000000222 49116 1727204689.19120: WORKER PROCESS EXITING 49116 1727204689.19124: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.19324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.19475: done with get_vars() 49116 1727204689.19487: done getting variables 49116 1727204689.19536: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204689.19645: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.030) 0:00:12.221 ***** 49116 1727204689.19672: entering _queue_task() for managed-node3/command 49116 1727204689.19943: worker is 1 (out of 1 available) 49116 1727204689.19957: exiting _queue_task() for managed-node3/command 49116 1727204689.19974: done queuing things up, now waiting for results queue to drain 49116 1727204689.19976: waiting for pending results... 
49116 1727204689.20170: running TaskExecutor() for managed-node3/TASK: Delete dummy interface lsr101 49116 1727204689.20237: in run() - task 127b8e07-fff9-02f7-957b-000000000223 49116 1727204689.20247: variable 'ansible_search_path' from source: unknown 49116 1727204689.20251: variable 'ansible_search_path' from source: unknown 49116 1727204689.20287: calling self._execute() 49116 1727204689.20367: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.20374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.20383: variable 'omit' from source: magic vars 49116 1727204689.20685: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.20695: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.20856: variable 'type' from source: play vars 49116 1727204689.20859: variable 'state' from source: include params 49116 1727204689.20862: variable 'interface' from source: play vars 49116 1727204689.20866: variable 'current_interfaces' from source: set_fact 49116 1727204689.20870: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 49116 1727204689.20873: when evaluation is False, skipping this task 49116 1727204689.20875: _execute() done 49116 1727204689.20878: dumping result to json 49116 1727204689.20880: done dumping result, returning 49116 1727204689.20887: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface lsr101 [127b8e07-fff9-02f7-957b-000000000223] 49116 1727204689.20893: sending task result for task 127b8e07-fff9-02f7-957b-000000000223 49116 1727204689.20990: done sending task result for task 127b8e07-fff9-02f7-957b-000000000223 49116 1727204689.20993: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204689.21054: no more pending results, returning what we have 49116 1727204689.21058: results queue empty 49116 1727204689.21059: checking for any_errors_fatal 49116 1727204689.21067: done checking for any_errors_fatal 49116 1727204689.21068: checking for max_fail_percentage 49116 1727204689.21070: done checking for max_fail_percentage 49116 1727204689.21071: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.21072: done checking to see if all hosts have failed 49116 1727204689.21073: getting the remaining hosts for this loop 49116 1727204689.21074: done getting the remaining hosts for this loop 49116 1727204689.21078: getting the next task for host managed-node3 49116 1727204689.21084: done getting next task for host managed-node3 49116 1727204689.21087: ^ task is: TASK: Create tap interface {{ interface }} 49116 1727204689.21090: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.21093: getting variables 49116 1727204689.21095: in VariableManager get_vars() 49116 1727204689.21147: Calling all_inventory to load vars for managed-node3 49116 1727204689.21150: Calling groups_inventory to load vars for managed-node3 49116 1727204689.21152: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.21168: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.21170: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.21174: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.21375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.21519: done with get_vars() 49116 1727204689.21529: done getting variables 49116 1727204689.21583: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204689.21677: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.020) 0:00:12.242 ***** 49116 1727204689.21702: entering _queue_task() for managed-node3/command 49116 1727204689.22004: worker is 1 (out of 1 available) 49116 1727204689.22019: exiting _queue_task() for managed-node3/command 49116 1727204689.22035: done queuing things up, now waiting for results queue to drain 49116 1727204689.22037: waiting for pending results... 
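[Editor's sketch] The executor also logs where each variable in these conditions comes from: 'interface' and 'type' from play vars, 'state' from include params, and 'current_interfaces' from an earlier set_fact. A minimal illustration of how such a layout is commonly wired; only interface: lsr101 is confirmed by this log, while the host pattern, file name, and the type/state values are placeholders:

    - hosts: all
      vars:
        interface: lsr101
        type: veth                    # placeholder; the actual value is not visible in this excerpt
      tasks:
        - name: Manage the test interface
          ansible.builtin.include_tasks: manage_test_interface.yml
          vars:
            state: present            # 'state' arrives as an include param per the log; value is a placeholder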
49116 1727204689.22587: running TaskExecutor() for managed-node3/TASK: Create tap interface lsr101 49116 1727204689.22598: in run() - task 127b8e07-fff9-02f7-957b-000000000224 49116 1727204689.22602: variable 'ansible_search_path' from source: unknown 49116 1727204689.22604: variable 'ansible_search_path' from source: unknown 49116 1727204689.22607: calling self._execute() 49116 1727204689.22625: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.22639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.22654: variable 'omit' from source: magic vars 49116 1727204689.23055: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.23079: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.23285: variable 'type' from source: play vars 49116 1727204689.23288: variable 'state' from source: include params 49116 1727204689.23292: variable 'interface' from source: play vars 49116 1727204689.23300: variable 'current_interfaces' from source: set_fact 49116 1727204689.23304: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 49116 1727204689.23307: when evaluation is False, skipping this task 49116 1727204689.23311: _execute() done 49116 1727204689.23313: dumping result to json 49116 1727204689.23318: done dumping result, returning 49116 1727204689.23325: done running TaskExecutor() for managed-node3/TASK: Create tap interface lsr101 [127b8e07-fff9-02f7-957b-000000000224] 49116 1727204689.23331: sending task result for task 127b8e07-fff9-02f7-957b-000000000224 49116 1727204689.23430: done sending task result for task 127b8e07-fff9-02f7-957b-000000000224 49116 1727204689.23433: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204689.23499: no more pending results, returning what we have 49116 1727204689.23503: results queue empty 49116 1727204689.23504: checking for any_errors_fatal 49116 1727204689.23512: done checking for any_errors_fatal 49116 1727204689.23513: checking for max_fail_percentage 49116 1727204689.23514: done checking for max_fail_percentage 49116 1727204689.23515: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.23516: done checking to see if all hosts have failed 49116 1727204689.23517: getting the remaining hosts for this loop 49116 1727204689.23518: done getting the remaining hosts for this loop 49116 1727204689.23523: getting the next task for host managed-node3 49116 1727204689.23529: done getting next task for host managed-node3 49116 1727204689.23532: ^ task is: TASK: Delete tap interface {{ interface }} 49116 1727204689.23535: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.23539: getting variables 49116 1727204689.23540: in VariableManager get_vars() 49116 1727204689.23585: Calling all_inventory to load vars for managed-node3 49116 1727204689.23587: Calling groups_inventory to load vars for managed-node3 49116 1727204689.23589: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.23602: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.23604: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.23607: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.23776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.23926: done with get_vars() 49116 1727204689.23939: done getting variables 49116 1727204689.23989: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204689.24086: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.024) 0:00:12.266 ***** 49116 1727204689.24112: entering _queue_task() for managed-node3/command 49116 1727204689.24377: worker is 1 (out of 1 available) 49116 1727204689.24392: exiting _queue_task() for managed-node3/command 49116 1727204689.24407: done queuing things up, now waiting for results queue to drain 49116 1727204689.24409: waiting for pending results... 
49116 1727204689.24600: running TaskExecutor() for managed-node3/TASK: Delete tap interface lsr101 49116 1727204689.24675: in run() - task 127b8e07-fff9-02f7-957b-000000000225 49116 1727204689.24687: variable 'ansible_search_path' from source: unknown 49116 1727204689.24690: variable 'ansible_search_path' from source: unknown 49116 1727204689.24723: calling self._execute() 49116 1727204689.24802: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.24808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.24818: variable 'omit' from source: magic vars 49116 1727204689.25125: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.25138: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.25642: variable 'type' from source: play vars 49116 1727204689.25646: variable 'state' from source: include params 49116 1727204689.25650: variable 'interface' from source: play vars 49116 1727204689.25652: variable 'current_interfaces' from source: set_fact 49116 1727204689.25660: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 49116 1727204689.25663: when evaluation is False, skipping this task 49116 1727204689.25667: _execute() done 49116 1727204689.25670: dumping result to json 49116 1727204689.25674: done dumping result, returning 49116 1727204689.25681: done running TaskExecutor() for managed-node3/TASK: Delete tap interface lsr101 [127b8e07-fff9-02f7-957b-000000000225] 49116 1727204689.25686: sending task result for task 127b8e07-fff9-02f7-957b-000000000225 49116 1727204689.25782: done sending task result for task 127b8e07-fff9-02f7-957b-000000000225 49116 1727204689.25784: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204689.25838: no more pending results, returning what we have 49116 1727204689.25841: results queue empty 49116 1727204689.25842: checking for any_errors_fatal 49116 1727204689.25848: done checking for any_errors_fatal 49116 1727204689.25849: checking for max_fail_percentage 49116 1727204689.25851: done checking for max_fail_percentage 49116 1727204689.25852: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.25853: done checking to see if all hosts have failed 49116 1727204689.25853: getting the remaining hosts for this loop 49116 1727204689.25855: done getting the remaining hosts for this loop 49116 1727204689.25859: getting the next task for host managed-node3 49116 1727204689.25869: done getting next task for host managed-node3 49116 1727204689.25873: ^ task is: TASK: Include the task 'assert_device_present.yml' 49116 1727204689.25876: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.25880: getting variables 49116 1727204689.25881: in VariableManager get_vars() 49116 1727204689.25925: Calling all_inventory to load vars for managed-node3 49116 1727204689.25928: Calling groups_inventory to load vars for managed-node3 49116 1727204689.25930: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.25941: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.25944: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.25946: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.26427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.26648: done with get_vars() 49116 1727204689.26661: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:16 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.026) 0:00:12.292 ***** 49116 1727204689.26767: entering _queue_task() for managed-node3/include_tasks 49116 1727204689.27135: worker is 1 (out of 1 available) 49116 1727204689.27149: exiting _queue_task() for managed-node3/include_tasks 49116 1727204689.27164: done queuing things up, now waiting for results queue to drain 49116 1727204689.27168: waiting for pending results... 49116 1727204689.27595: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 49116 1727204689.27601: in run() - task 127b8e07-fff9-02f7-957b-00000000000d 49116 1727204689.27605: variable 'ansible_search_path' from source: unknown 49116 1727204689.27674: calling self._execute() 49116 1727204689.27774: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.27778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.27782: variable 'omit' from source: magic vars 49116 1727204689.28232: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.28249: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.28253: _execute() done 49116 1727204689.28260: dumping result to json 49116 1727204689.28262: done dumping result, returning 49116 1727204689.28271: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [127b8e07-fff9-02f7-957b-00000000000d] 49116 1727204689.28277: sending task result for task 127b8e07-fff9-02f7-957b-00000000000d 49116 1727204689.28626: no more pending results, returning what we have 49116 1727204689.28631: in VariableManager get_vars() 49116 1727204689.28682: Calling all_inventory to load vars for managed-node3 49116 1727204689.28685: Calling groups_inventory to load vars for managed-node3 49116 1727204689.28688: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.28695: done sending task result for task 127b8e07-fff9-02f7-957b-00000000000d 49116 1727204689.28707: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.28711: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.28715: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.28950: WORKER PROCESS EXITING 49116 1727204689.28970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.29224: done with get_vars() 49116 
1727204689.29233: variable 'ansible_search_path' from source: unknown 49116 1727204689.29249: we have included files to process 49116 1727204689.29250: generating all_blocks data 49116 1727204689.29252: done generating all_blocks data 49116 1727204689.29266: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49116 1727204689.29268: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49116 1727204689.29271: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49116 1727204689.29475: in VariableManager get_vars() 49116 1727204689.29512: done with get_vars() 49116 1727204689.29721: done processing included file 49116 1727204689.29723: iterating over new_blocks loaded from include file 49116 1727204689.29725: in VariableManager get_vars() 49116 1727204689.29744: done with get_vars() 49116 1727204689.29745: filtering new block on tags 49116 1727204689.29766: done filtering new block on tags 49116 1727204689.29769: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 49116 1727204689.29775: extending task lists for all hosts with included blocks 49116 1727204689.32761: done extending task lists 49116 1727204689.32763: done processing included files 49116 1727204689.32764: results queue empty 49116 1727204689.32767: checking for any_errors_fatal 49116 1727204689.32771: done checking for any_errors_fatal 49116 1727204689.32772: checking for max_fail_percentage 49116 1727204689.32773: done checking for max_fail_percentage 49116 1727204689.32774: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.32775: done checking to see if all hosts have failed 49116 1727204689.32776: getting the remaining hosts for this loop 49116 1727204689.32785: done getting the remaining hosts for this loop 49116 1727204689.32789: getting the next task for host managed-node3 49116 1727204689.32794: done getting next task for host managed-node3 49116 1727204689.32796: ^ task is: TASK: Include the task 'get_interface_stat.yml' 49116 1727204689.32799: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.32802: getting variables 49116 1727204689.32803: in VariableManager get_vars() 49116 1727204689.32826: Calling all_inventory to load vars for managed-node3 49116 1727204689.32829: Calling groups_inventory to load vars for managed-node3 49116 1727204689.32831: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.32840: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.32843: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.32846: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.33045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.33330: done with get_vars() 49116 1727204689.33343: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.066) 0:00:12.359 ***** 49116 1727204689.33442: entering _queue_task() for managed-node3/include_tasks 49116 1727204689.33849: worker is 1 (out of 1 available) 49116 1727204689.33863: exiting _queue_task() for managed-node3/include_tasks 49116 1727204689.33880: done queuing things up, now waiting for results queue to drain 49116 1727204689.33882: waiting for pending results... 49116 1727204689.34287: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 49116 1727204689.34397: in run() - task 127b8e07-fff9-02f7-957b-00000000038b 49116 1727204689.34410: variable 'ansible_search_path' from source: unknown 49116 1727204689.34413: variable 'ansible_search_path' from source: unknown 49116 1727204689.34503: calling self._execute() 49116 1727204689.34568: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.34582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.34598: variable 'omit' from source: magic vars 49116 1727204689.35084: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.35165: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.35170: _execute() done 49116 1727204689.35174: dumping result to json 49116 1727204689.35176: done dumping result, returning 49116 1727204689.35179: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-02f7-957b-00000000038b] 49116 1727204689.35181: sending task result for task 127b8e07-fff9-02f7-957b-00000000038b 49116 1727204689.35300: no more pending results, returning what we have 49116 1727204689.35307: in VariableManager get_vars() 49116 1727204689.35570: Calling all_inventory to load vars for managed-node3 49116 1727204689.35574: Calling groups_inventory to load vars for managed-node3 49116 1727204689.35578: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.35591: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.35595: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.35598: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.35948: done sending task result for task 127b8e07-fff9-02f7-957b-00000000038b 49116 1727204689.35952: WORKER PROCESS EXITING 49116 1727204689.35980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 49116 1727204689.36260: done with get_vars() 49116 1727204689.36273: variable 'ansible_search_path' from source: unknown 49116 1727204689.36274: variable 'ansible_search_path' from source: unknown 49116 1727204689.36316: we have included files to process 49116 1727204689.36318: generating all_blocks data 49116 1727204689.36319: done generating all_blocks data 49116 1727204689.36321: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49116 1727204689.36331: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49116 1727204689.36335: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49116 1727204689.36615: done processing included file 49116 1727204689.36617: iterating over new_blocks loaded from include file 49116 1727204689.36619: in VariableManager get_vars() 49116 1727204689.36642: done with get_vars() 49116 1727204689.36644: filtering new block on tags 49116 1727204689.36673: done filtering new block on tags 49116 1727204689.36675: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 49116 1727204689.36682: extending task lists for all hosts with included blocks 49116 1727204689.36807: done extending task lists 49116 1727204689.36808: done processing included files 49116 1727204689.36809: results queue empty 49116 1727204689.36810: checking for any_errors_fatal 49116 1727204689.36813: done checking for any_errors_fatal 49116 1727204689.36815: checking for max_fail_percentage 49116 1727204689.36817: done checking for max_fail_percentage 49116 1727204689.36818: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.36818: done checking to see if all hosts have failed 49116 1727204689.36819: getting the remaining hosts for this loop 49116 1727204689.36820: done getting the remaining hosts for this loop 49116 1727204689.36823: getting the next task for host managed-node3 49116 1727204689.36827: done getting next task for host managed-node3 49116 1727204689.36830: ^ task is: TASK: Get stat for interface {{ interface }} 49116 1727204689.36833: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.36835: getting variables 49116 1727204689.36836: in VariableManager get_vars() 49116 1727204689.36853: Calling all_inventory to load vars for managed-node3 49116 1727204689.36855: Calling groups_inventory to load vars for managed-node3 49116 1727204689.36858: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.36864: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.36874: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.36882: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.37118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.37380: done with get_vars() 49116 1727204689.37393: done getting variables 49116 1727204689.37601: variable 'interface' from source: play vars TASK [Get stat for interface lsr101] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.041) 0:00:12.401 ***** 49116 1727204689.37647: entering _queue_task() for managed-node3/stat 49116 1727204689.38039: worker is 1 (out of 1 available) 49116 1727204689.38054: exiting _queue_task() for managed-node3/stat 49116 1727204689.38076: done queuing things up, now waiting for results queue to drain 49116 1727204689.38081: waiting for pending results... 49116 1727204689.38413: running TaskExecutor() for managed-node3/TASK: Get stat for interface lsr101 49116 1727204689.38563: in run() - task 127b8e07-fff9-02f7-957b-0000000004a4 49116 1727204689.38628: variable 'ansible_search_path' from source: unknown 49116 1727204689.38633: variable 'ansible_search_path' from source: unknown 49116 1727204689.38664: calling self._execute() 49116 1727204689.38779: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.38840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.38846: variable 'omit' from source: magic vars 49116 1727204689.39372: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.39376: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.39380: variable 'omit' from source: magic vars 49116 1727204689.39384: variable 'omit' from source: magic vars 49116 1727204689.39523: variable 'interface' from source: play vars 49116 1727204689.39549: variable 'omit' from source: magic vars 49116 1727204689.39603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204689.39661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204689.39689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204689.39723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204689.39748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204689.39836: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204689.39840: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.39842: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 49116 1727204689.39944: Set connection var ansible_connection to ssh 49116 1727204689.39955: Set connection var ansible_timeout to 10 49116 1727204689.39971: Set connection var ansible_shell_executable to /bin/sh 49116 1727204689.40053: Set connection var ansible_pipelining to False 49116 1727204689.40056: Set connection var ansible_shell_type to sh 49116 1727204689.40061: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204689.40063: variable 'ansible_shell_executable' from source: unknown 49116 1727204689.40066: variable 'ansible_connection' from source: unknown 49116 1727204689.40068: variable 'ansible_module_compression' from source: unknown 49116 1727204689.40072: variable 'ansible_shell_type' from source: unknown 49116 1727204689.40074: variable 'ansible_shell_executable' from source: unknown 49116 1727204689.40076: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.40079: variable 'ansible_pipelining' from source: unknown 49116 1727204689.40081: variable 'ansible_timeout' from source: unknown 49116 1727204689.40084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.40375: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204689.40381: variable 'omit' from source: magic vars 49116 1727204689.40384: starting attempt loop 49116 1727204689.40386: running the handler 49116 1727204689.40389: _low_level_execute_command(): starting 49116 1727204689.40479: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204689.41389: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.41396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204689.41453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204689.41525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204689.43358: stdout chunk (state=3): >>>/root <<< 49116 1727204689.43466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204689.43528: stderr chunk (state=3): >>><<< 49116 1727204689.43531: stdout chunk (state=3): >>><<< 49116 1727204689.43558: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204689.43575: _low_level_execute_command(): starting 49116 1727204689.43586: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163 `" && echo ansible-tmp-1727204689.4355636-50068-168122221248163="` echo /root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163 `" ) && sleep 0' 49116 1727204689.44198: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.44222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204689.44245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204689.44413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204689.46559: stdout chunk (state=3): >>>ansible-tmp-1727204689.4355636-50068-168122221248163=/root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163 <<< 49116 1727204689.46678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204689.46750: stderr chunk (state=3): >>><<< 49116 1727204689.46753: stdout chunk (state=3): >>><<< 49116 1727204689.46771: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204689.4355636-50068-168122221248163=/root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204689.46823: variable 'ansible_module_compression' from source: unknown 49116 1727204689.46870: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 49116 1727204689.46905: variable 'ansible_facts' from source: unknown 49116 1727204689.46975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/AnsiballZ_stat.py 49116 1727204689.47095: Sending initial data 49116 1727204689.47098: Sent initial data (153 bytes) 49116 1727204689.47626: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204689.47631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204689.47634: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204689.47637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.47704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204689.47707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204689.47778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204689.49559: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204689.49628: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204689.49696: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpp5r2sf7c /root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/AnsiballZ_stat.py <<< 49116 1727204689.49708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/AnsiballZ_stat.py" <<< 49116 1727204689.49768: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpp5r2sf7c" to remote "/root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/AnsiballZ_stat.py" <<< 49116 1727204689.49771: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/AnsiballZ_stat.py" <<< 49116 1727204689.50461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204689.50543: stderr chunk (state=3): >>><<< 49116 1727204689.50547: stdout chunk (state=3): >>><<< 49116 1727204689.50573: done transferring module to remote 49116 1727204689.50584: _low_level_execute_command(): starting 49116 1727204689.50590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/ /root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/AnsiballZ_stat.py && sleep 0' 49116 1727204689.51100: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204689.51107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 49116 1727204689.51110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204689.51113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.51163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204689.51170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204689.51254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204689.53279: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 49116 1727204689.53346: stderr chunk (state=3): >>><<< 49116 1727204689.53349: stdout chunk (state=3): >>><<< 49116 1727204689.53367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204689.53370: _low_level_execute_command(): starting 49116 1727204689.53373: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/AnsiballZ_stat.py && sleep 0' 49116 1727204689.53904: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204689.53908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.53911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204689.53914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.53974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204689.53978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204689.53990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204689.54070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204689.71856: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 42176, "dev": 
23, "nlink": 1, "atime": 1727204687.6188965, "mtime": 1727204687.6188965, "ctime": 1727204687.6188965, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 49116 1727204689.73411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204689.73475: stderr chunk (state=3): >>><<< 49116 1727204689.73479: stdout chunk (state=3): >>><<< 49116 1727204689.73494: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 42176, "dev": 23, "nlink": 1, "atime": 1727204687.6188965, "mtime": 1727204687.6188965, "ctime": 1727204687.6188965, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204689.73544: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204689.73556: _low_level_execute_command(): starting 49116 1727204689.73560: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204689.4355636-50068-168122221248163/ > /dev/null 2>&1 && sleep 0' 49116 1727204689.74048: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204689.74053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.74084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204689.74087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204689.74150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204689.74153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204689.74156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204689.74236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204689.76297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204689.76372: stderr chunk (state=3): >>><<< 49116 1727204689.76376: stdout chunk (state=3): >>><<< 49116 1727204689.76385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204689.76392: handler run complete 49116 1727204689.76428: attempt loop complete, returning result 49116 1727204689.76432: _execute() done 49116 1727204689.76434: dumping result to json 49116 1727204689.76441: done dumping result, returning 49116 1727204689.76455: done running TaskExecutor() for managed-node3/TASK: Get stat for interface lsr101 [127b8e07-fff9-02f7-957b-0000000004a4] 49116 1727204689.76457: sending task result for task 127b8e07-fff9-02f7-957b-0000000004a4 49116 1727204689.76578: done sending task result for task 127b8e07-fff9-02f7-957b-0000000004a4 49116 1727204689.76581: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204687.6188965, "block_size": 4096, "blocks": 0, "ctime": 1727204687.6188965, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 42176, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "mode": "0777", "mtime": 1727204687.6188965, "nlink": 1, "path": "/sys/class/net/lsr101", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 49116 1727204689.76684: no more pending results, returning what we have 49116 1727204689.76687: results queue empty 49116 1727204689.76688: checking for any_errors_fatal 49116 1727204689.76689: done checking for any_errors_fatal 49116 1727204689.76690: checking for max_fail_percentage 49116 1727204689.76692: done checking for max_fail_percentage 49116 1727204689.76692: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.76693: done checking to see if all hosts have failed 49116 1727204689.76694: getting the remaining hosts for this loop 49116 1727204689.76695: done getting the remaining hosts for this loop 49116 1727204689.76700: getting the next task for host managed-node3 49116 1727204689.76707: done getting next task for host managed-node3 49116 1727204689.76710: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 49116 1727204689.76713: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.76716: getting variables 49116 1727204689.76717: in VariableManager get_vars() 49116 1727204689.76755: Calling all_inventory to load vars for managed-node3 49116 1727204689.76758: Calling groups_inventory to load vars for managed-node3 49116 1727204689.76760: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.76783: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.76786: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.76790: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.76952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.77135: done with get_vars() 49116 1727204689.77146: done getting variables 49116 1727204689.77231: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 49116 1727204689.77334: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'lsr101'] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.397) 0:00:12.798 ***** 49116 1727204689.77358: entering _queue_task() for managed-node3/assert 49116 1727204689.77359: Creating lock for assert 49116 1727204689.77648: worker is 1 (out of 1 available) 49116 1727204689.77664: exiting _queue_task() for managed-node3/assert 49116 1727204689.77679: done queuing things up, now waiting for results queue to drain 49116 1727204689.77680: waiting for pending results... 
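The task queued above comes from assert_device_present.yml:5 and, judging from the conditional evaluated in the following records (interface_stat.stat.exists), it is an assert over the stat result registered by the previous task. A hedged sketch of what that file likely contains; only the task name and the condition are visible in the log, the failure message is an assumption:

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
    fail_msg: "Interface {{ interface }} is not present"   # assumed wording, not shown in the log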
49116 1727204689.77865: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'lsr101' 49116 1727204689.77940: in run() - task 127b8e07-fff9-02f7-957b-00000000038c 49116 1727204689.77948: variable 'ansible_search_path' from source: unknown 49116 1727204689.77953: variable 'ansible_search_path' from source: unknown 49116 1727204689.77988: calling self._execute() 49116 1727204689.78066: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.78071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.78081: variable 'omit' from source: magic vars 49116 1727204689.78383: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.78394: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.78400: variable 'omit' from source: magic vars 49116 1727204689.78431: variable 'omit' from source: magic vars 49116 1727204689.78512: variable 'interface' from source: play vars 49116 1727204689.78527: variable 'omit' from source: magic vars 49116 1727204689.78568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204689.78602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204689.78619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204689.78637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204689.78649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204689.78677: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204689.78680: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.78685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.78759: Set connection var ansible_connection to ssh 49116 1727204689.78771: Set connection var ansible_timeout to 10 49116 1727204689.78779: Set connection var ansible_shell_executable to /bin/sh 49116 1727204689.78784: Set connection var ansible_pipelining to False 49116 1727204689.78787: Set connection var ansible_shell_type to sh 49116 1727204689.78792: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204689.78816: variable 'ansible_shell_executable' from source: unknown 49116 1727204689.78819: variable 'ansible_connection' from source: unknown 49116 1727204689.78822: variable 'ansible_module_compression' from source: unknown 49116 1727204689.78825: variable 'ansible_shell_type' from source: unknown 49116 1727204689.78828: variable 'ansible_shell_executable' from source: unknown 49116 1727204689.78830: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.78835: variable 'ansible_pipelining' from source: unknown 49116 1727204689.78837: variable 'ansible_timeout' from source: unknown 49116 1727204689.78839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.78957: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 49116 1727204689.78968: variable 'omit' from source: magic vars 49116 1727204689.78975: starting attempt loop 49116 1727204689.78977: running the handler 49116 1727204689.79088: variable 'interface_stat' from source: set_fact 49116 1727204689.79105: Evaluated conditional (interface_stat.stat.exists): True 49116 1727204689.79115: handler run complete 49116 1727204689.79134: attempt loop complete, returning result 49116 1727204689.79137: _execute() done 49116 1727204689.79140: dumping result to json 49116 1727204689.79142: done dumping result, returning 49116 1727204689.79144: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'lsr101' [127b8e07-fff9-02f7-957b-00000000038c] 49116 1727204689.79153: sending task result for task 127b8e07-fff9-02f7-957b-00000000038c 49116 1727204689.79251: done sending task result for task 127b8e07-fff9-02f7-957b-00000000038c 49116 1727204689.79255: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 49116 1727204689.79310: no more pending results, returning what we have 49116 1727204689.79313: results queue empty 49116 1727204689.79314: checking for any_errors_fatal 49116 1727204689.79324: done checking for any_errors_fatal 49116 1727204689.79325: checking for max_fail_percentage 49116 1727204689.79327: done checking for max_fail_percentage 49116 1727204689.79328: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.79328: done checking to see if all hosts have failed 49116 1727204689.79329: getting the remaining hosts for this loop 49116 1727204689.79331: done getting the remaining hosts for this loop 49116 1727204689.79337: getting the next task for host managed-node3 49116 1727204689.79345: done getting next task for host managed-node3 49116 1727204689.79348: ^ task is: TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 49116 1727204689.79350: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204689.79353: getting variables 49116 1727204689.79354: in VariableManager get_vars() 49116 1727204689.79407: Calling all_inventory to load vars for managed-node3 49116 1727204689.79410: Calling groups_inventory to load vars for managed-node3 49116 1727204689.79412: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.79424: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.79426: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.79429: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.79594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.79738: done with get_vars() 49116 1727204689.79749: done getting variables 49116 1727204689.79797: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure the MTU for a vlan interface without autoconnect.] 
*** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:18 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.024) 0:00:12.823 ***** 49116 1727204689.79823: entering _queue_task() for managed-node3/debug 49116 1727204689.80080: worker is 1 (out of 1 available) 49116 1727204689.80097: exiting _queue_task() for managed-node3/debug 49116 1727204689.80110: done queuing things up, now waiting for results queue to drain 49116 1727204689.80111: waiting for pending results... 49116 1727204689.80485: running TaskExecutor() for managed-node3/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 49116 1727204689.80496: in run() - task 127b8e07-fff9-02f7-957b-00000000000e 49116 1727204689.80500: variable 'ansible_search_path' from source: unknown 49116 1727204689.80522: calling self._execute() 49116 1727204689.80624: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.80641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.80658: variable 'omit' from source: magic vars 49116 1727204689.81068: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.81086: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.81098: variable 'omit' from source: magic vars 49116 1727204689.81172: variable 'omit' from source: magic vars 49116 1727204689.81176: variable 'omit' from source: magic vars 49116 1727204689.81226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204689.81273: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204689.81300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204689.81324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204689.81347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204689.81392: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204689.81396: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.81399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.81504: Set connection var ansible_connection to ssh 49116 1727204689.81515: Set connection var ansible_timeout to 10 49116 1727204689.81523: Set connection var ansible_shell_executable to /bin/sh 49116 1727204689.81528: Set connection var ansible_pipelining to False 49116 1727204689.81530: Set connection var ansible_shell_type to sh 49116 1727204689.81540: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204689.81561: variable 'ansible_shell_executable' from source: unknown 49116 1727204689.81565: variable 'ansible_connection' from source: unknown 49116 1727204689.81570: variable 'ansible_module_compression' from source: unknown 49116 1727204689.81573: variable 'ansible_shell_type' from source: unknown 49116 1727204689.81576: variable 'ansible_shell_executable' from source: unknown 49116 1727204689.81578: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.81581: variable 'ansible_pipelining' from source: unknown 49116 1727204689.81585: variable 'ansible_timeout' from source: 
unknown 49116 1727204689.81588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.81710: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204689.81721: variable 'omit' from source: magic vars 49116 1727204689.81727: starting attempt loop 49116 1727204689.81730: running the handler 49116 1727204689.81777: handler run complete 49116 1727204689.81791: attempt loop complete, returning result 49116 1727204689.81795: _execute() done 49116 1727204689.81797: dumping result to json 49116 1727204689.81800: done dumping result, returning 49116 1727204689.81818: done running TaskExecutor() for managed-node3/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. [127b8e07-fff9-02f7-957b-00000000000e] 49116 1727204689.81821: sending task result for task 127b8e07-fff9-02f7-957b-00000000000e 49116 1727204689.81915: done sending task result for task 127b8e07-fff9-02f7-957b-00000000000e 49116 1727204689.81918: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: ################################################## 49116 1727204689.81970: no more pending results, returning what we have 49116 1727204689.81973: results queue empty 49116 1727204689.81975: checking for any_errors_fatal 49116 1727204689.81983: done checking for any_errors_fatal 49116 1727204689.81984: checking for max_fail_percentage 49116 1727204689.81986: done checking for max_fail_percentage 49116 1727204689.81987: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.81988: done checking to see if all hosts have failed 49116 1727204689.81988: getting the remaining hosts for this loop 49116 1727204689.81990: done getting the remaining hosts for this loop 49116 1727204689.81994: getting the next task for host managed-node3 49116 1727204689.82001: done getting next task for host managed-node3 49116 1727204689.82007: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 49116 1727204689.82010: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204689.82028: getting variables 49116 1727204689.82030: in VariableManager get_vars() 49116 1727204689.82082: Calling all_inventory to load vars for managed-node3 49116 1727204689.82085: Calling groups_inventory to load vars for managed-node3 49116 1727204689.82087: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.82098: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.82100: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.82103: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.82306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.82459: done with get_vars() 49116 1727204689.82470: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.027) 0:00:12.850 ***** 49116 1727204689.82550: entering _queue_task() for managed-node3/include_tasks 49116 1727204689.82811: worker is 1 (out of 1 available) 49116 1727204689.82826: exiting _queue_task() for managed-node3/include_tasks 49116 1727204689.82843: done queuing things up, now waiting for results queue to drain 49116 1727204689.82844: waiting for pending results... 49116 1727204689.83024: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 49116 1727204689.83115: in run() - task 127b8e07-fff9-02f7-957b-000000000016 49116 1727204689.83128: variable 'ansible_search_path' from source: unknown 49116 1727204689.83134: variable 'ansible_search_path' from source: unknown 49116 1727204689.83164: calling self._execute() 49116 1727204689.83238: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.83242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.83250: variable 'omit' from source: magic vars 49116 1727204689.83550: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.83561: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.83570: _execute() done 49116 1727204689.83574: dumping result to json 49116 1727204689.83576: done dumping result, returning 49116 1727204689.83584: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-02f7-957b-000000000016] 49116 1727204689.83589: sending task result for task 127b8e07-fff9-02f7-957b-000000000016 49116 1727204689.83738: no more pending results, returning what we have 49116 1727204689.83744: in VariableManager get_vars() 49116 1727204689.83797: Calling all_inventory to load vars for managed-node3 49116 1727204689.83800: Calling groups_inventory to load vars for managed-node3 49116 1727204689.83802: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.83813: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.83815: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.83818: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.83993: done sending task result for task 127b8e07-fff9-02f7-957b-000000000016 49116 1727204689.83996: WORKER PROCESS EXITING 49116 1727204689.84007: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.84157: done with get_vars() 49116 1727204689.84167: variable 'ansible_search_path' from source: unknown 49116 1727204689.84168: variable 'ansible_search_path' from source: unknown 49116 1727204689.84203: we have included files to process 49116 1727204689.84204: generating all_blocks data 49116 1727204689.84205: done generating all_blocks data 49116 1727204689.84209: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49116 1727204689.84209: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49116 1727204689.84211: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49116 1727204689.84926: done processing included file 49116 1727204689.84929: iterating over new_blocks loaded from include file 49116 1727204689.84930: in VariableManager get_vars() 49116 1727204689.84956: done with get_vars() 49116 1727204689.84957: filtering new block on tags 49116 1727204689.84976: done filtering new block on tags 49116 1727204689.84979: in VariableManager get_vars() 49116 1727204689.85004: done with get_vars() 49116 1727204689.85006: filtering new block on tags 49116 1727204689.85029: done filtering new block on tags 49116 1727204689.85031: in VariableManager get_vars() 49116 1727204689.85057: done with get_vars() 49116 1727204689.85059: filtering new block on tags 49116 1727204689.85080: done filtering new block on tags 49116 1727204689.85082: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 49116 1727204689.85088: extending task lists for all hosts with included blocks 49116 1727204689.85905: done extending task lists 49116 1727204689.85907: done processing included files 49116 1727204689.85908: results queue empty 49116 1727204689.85909: checking for any_errors_fatal 49116 1727204689.85913: done checking for any_errors_fatal 49116 1727204689.85914: checking for max_fail_percentage 49116 1727204689.85915: done checking for max_fail_percentage 49116 1727204689.85916: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.85917: done checking to see if all hosts have failed 49116 1727204689.85918: getting the remaining hosts for this loop 49116 1727204689.85919: done getting the remaining hosts for this loop 49116 1727204689.85922: getting the next task for host managed-node3 49116 1727204689.85927: done getting next task for host managed-node3 49116 1727204689.85930: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 49116 1727204689.85933: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204689.85943: getting variables 49116 1727204689.85945: in VariableManager get_vars() 49116 1727204689.85967: Calling all_inventory to load vars for managed-node3 49116 1727204689.85970: Calling groups_inventory to load vars for managed-node3 49116 1727204689.85972: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.85978: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.85981: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.85984: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.86189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.86433: done with get_vars() 49116 1727204689.86447: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.039) 0:00:12.890 ***** 49116 1727204689.86528: entering _queue_task() for managed-node3/setup 49116 1727204689.86895: worker is 1 (out of 1 available) 49116 1727204689.86911: exiting _queue_task() for managed-node3/setup 49116 1727204689.86925: done queuing things up, now waiting for results queue to drain 49116 1727204689.86927: waiting for pending results... 49116 1727204689.87296: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 49116 1727204689.87423: in run() - task 127b8e07-fff9-02f7-957b-0000000004bf 49116 1727204689.87428: variable 'ansible_search_path' from source: unknown 49116 1727204689.87431: variable 'ansible_search_path' from source: unknown 49116 1727204689.87473: calling self._execute() 49116 1727204689.87624: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.87628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.87640: variable 'omit' from source: magic vars 49116 1727204689.88009: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.88021: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.88277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204689.90726: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204689.90852: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204689.90857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204689.90883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204689.90911: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204689.91005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
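The records above show roles/network/tasks/main.yml:4 ('Ensure ansible_facts used by role') being handled as an include: set_facts.yml is loaded, split into blocks, filtered on tags and appended to the task list for managed-node3, with no SSH activity because include_tasks runs entirely on the controller. A minimal sketch of such an include step, inferred from the task name and the included file path in the log rather than taken from the role source:

- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml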
49116 1727204689.91036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204689.91063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204689.91102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204689.91115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204689.91173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204689.91196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204689.91222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204689.91258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204689.91274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204689.91461: variable '__network_required_facts' from source: role '' defaults 49116 1727204689.91475: variable 'ansible_facts' from source: unknown 49116 1727204689.91590: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 49116 1727204689.91594: when evaluation is False, skipping this task 49116 1727204689.91596: _execute() done 49116 1727204689.91599: dumping result to json 49116 1727204689.91601: done dumping result, returning 49116 1727204689.91611: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-02f7-957b-0000000004bf] 49116 1727204689.91617: sending task result for task 127b8e07-fff9-02f7-957b-0000000004bf skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204689.91793: no more pending results, returning what we have 49116 1727204689.91799: results queue empty 49116 1727204689.91801: checking for any_errors_fatal 49116 1727204689.91802: done checking for any_errors_fatal 49116 1727204689.91803: checking for max_fail_percentage 49116 1727204689.91805: done checking for max_fail_percentage 49116 1727204689.91806: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.91807: done checking to see if all hosts have failed 49116 1727204689.91808: getting the remaining hosts for 
this loop 49116 1727204689.91810: done getting the remaining hosts for this loop 49116 1727204689.91819: getting the next task for host managed-node3 49116 1727204689.91829: done getting next task for host managed-node3 49116 1727204689.91833: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 49116 1727204689.91837: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204689.91855: getting variables 49116 1727204689.91857: in VariableManager get_vars() 49116 1727204689.91910: Calling all_inventory to load vars for managed-node3 49116 1727204689.91913: Calling groups_inventory to load vars for managed-node3 49116 1727204689.91916: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.91930: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.91933: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.91937: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.92341: done sending task result for task 127b8e07-fff9-02f7-957b-0000000004bf 49116 1727204689.92346: WORKER PROCESS EXITING 49116 1727204689.92435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.92724: done with get_vars() 49116 1727204689.92739: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.063) 0:00:12.953 ***** 49116 1727204689.92863: entering _queue_task() for managed-node3/stat 49116 1727204689.93230: worker is 1 (out of 1 available) 49116 1727204689.93244: exiting _queue_task() for managed-node3/stat 49116 1727204689.93258: done queuing things up, now waiting for results queue to drain 49116 1727204689.93259: waiting for pending results... 
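The skip recorded above for 'Ensure ansible_facts used by role are present' (set_facts.yml:3) follows directly from the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluating to False: every fact the role requires was already present in ansible_facts, so the extra fact gathering was unnecessary, and the result is censored because the task sets no_log. A hedged sketch of such a guard task; the exact setup arguments are not visible in the log and are deliberately left out:

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:   # gather arguments not shown in the log; omitted here
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true             # matches the "censored" skip result above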
49116 1727204689.93570: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 49116 1727204689.93727: in run() - task 127b8e07-fff9-02f7-957b-0000000004c1 49116 1727204689.93746: variable 'ansible_search_path' from source: unknown 49116 1727204689.93750: variable 'ansible_search_path' from source: unknown 49116 1727204689.93796: calling self._execute() 49116 1727204689.93888: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.93901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.93912: variable 'omit' from source: magic vars 49116 1727204689.94456: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.94460: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.94522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204689.94813: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204689.94862: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204689.94900: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204689.94937: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204689.95035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204689.95059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204689.95090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204689.95124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204689.95237: variable '__network_is_ostree' from source: set_fact 49116 1727204689.95244: Evaluated conditional (not __network_is_ostree is defined): False 49116 1727204689.95247: when evaluation is False, skipping this task 49116 1727204689.95250: _execute() done 49116 1727204689.95253: dumping result to json 49116 1727204689.95255: done dumping result, returning 49116 1727204689.95267: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-02f7-957b-0000000004c1] 49116 1727204689.95273: sending task result for task 127b8e07-fff9-02f7-957b-0000000004c1 49116 1727204689.95389: done sending task result for task 127b8e07-fff9-02f7-957b-0000000004c1 49116 1727204689.95393: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 49116 1727204689.95457: no more pending results, returning what we have 49116 1727204689.95462: results queue empty 49116 1727204689.95463: checking for any_errors_fatal 49116 1727204689.95471: done checking for any_errors_fatal 49116 1727204689.95472: checking for 
max_fail_percentage 49116 1727204689.95474: done checking for max_fail_percentage 49116 1727204689.95475: checking to see if all hosts have failed and the running result is not ok 49116 1727204689.95476: done checking to see if all hosts have failed 49116 1727204689.95477: getting the remaining hosts for this loop 49116 1727204689.95479: done getting the remaining hosts for this loop 49116 1727204689.95484: getting the next task for host managed-node3 49116 1727204689.95491: done getting next task for host managed-node3 49116 1727204689.95495: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 49116 1727204689.95499: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204689.95514: getting variables 49116 1727204689.95516: in VariableManager get_vars() 49116 1727204689.95802: Calling all_inventory to load vars for managed-node3 49116 1727204689.95805: Calling groups_inventory to load vars for managed-node3 49116 1727204689.95809: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.95820: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.95823: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.95827: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.96031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204689.96254: done with get_vars() 49116 1727204689.96272: done getting variables 49116 1727204689.96342: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.035) 0:00:12.988 ***** 49116 1727204689.96391: entering _queue_task() for managed-node3/set_fact 49116 1727204689.96871: worker is 1 (out of 1 available) 49116 1727204689.96886: exiting _queue_task() for managed-node3/set_fact 49116 1727204689.96900: done queuing things up, now waiting for results queue to drain 49116 1727204689.96901: waiting for pending results... 
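The companion task queued here (tasks/set_facts.yml:17) uses set_fact and is gated on the same "not __network_is_ostree is defined" condition, so it is likewise skipped. A hedged sketch, assuming the register name from the previous sketch and that the flag simply mirrors whether the checked path exists:

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed expression; not visible in this skipped run
  when: not __network_is_ostree is defined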
49116 1727204689.97193: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 49116 1727204689.97297: in run() - task 127b8e07-fff9-02f7-957b-0000000004c2 49116 1727204689.97329: variable 'ansible_search_path' from source: unknown 49116 1727204689.97336: variable 'ansible_search_path' from source: unknown 49116 1727204689.97357: calling self._execute() 49116 1727204689.97444: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204689.97502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204689.97510: variable 'omit' from source: magic vars 49116 1727204689.97866: variable 'ansible_distribution_major_version' from source: facts 49116 1727204689.97880: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204689.98073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204689.98618: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204689.98622: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204689.98625: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204689.98628: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204689.98697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204689.98724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204689.98750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204689.98783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204689.98882: variable '__network_is_ostree' from source: set_fact 49116 1727204689.98899: Evaluated conditional (not __network_is_ostree is defined): False 49116 1727204689.98902: when evaluation is False, skipping this task 49116 1727204689.98905: _execute() done 49116 1727204689.98908: dumping result to json 49116 1727204689.98910: done dumping result, returning 49116 1727204689.98918: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-02f7-957b-0000000004c2] 49116 1727204689.98923: sending task result for task 127b8e07-fff9-02f7-957b-0000000004c2 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 49116 1727204689.99132: no more pending results, returning what we have 49116 1727204689.99136: results queue empty 49116 1727204689.99138: checking for any_errors_fatal 49116 1727204689.99145: done checking for any_errors_fatal 49116 1727204689.99146: checking for max_fail_percentage 49116 1727204689.99148: done checking for max_fail_percentage 49116 1727204689.99149: checking to see 
if all hosts have failed and the running result is not ok 49116 1727204689.99150: done checking to see if all hosts have failed 49116 1727204689.99151: getting the remaining hosts for this loop 49116 1727204689.99153: done getting the remaining hosts for this loop 49116 1727204689.99158: getting the next task for host managed-node3 49116 1727204689.99172: done getting next task for host managed-node3 49116 1727204689.99176: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 49116 1727204689.99180: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204689.99197: getting variables 49116 1727204689.99199: in VariableManager get_vars() 49116 1727204689.99248: Calling all_inventory to load vars for managed-node3 49116 1727204689.99252: Calling groups_inventory to load vars for managed-node3 49116 1727204689.99254: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204689.99270: Calling all_plugins_play to load vars for managed-node3 49116 1727204689.99273: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204689.99278: Calling groups_plugins_play to load vars for managed-node3 49116 1727204689.99750: done sending task result for task 127b8e07-fff9-02f7-957b-0000000004c2 49116 1727204689.99755: WORKER PROCESS EXITING 49116 1727204689.99776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204690.00017: done with get_vars() 49116 1727204690.00030: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:49 -0400 (0:00:00.037) 0:00:13.026 ***** 49116 1727204690.00140: entering _queue_task() for managed-node3/service_facts 49116 1727204690.00142: Creating lock for service_facts 49116 1727204690.00512: worker is 1 (out of 1 available) 49116 1727204690.00526: exiting _queue_task() for managed-node3/service_facts 49116 1727204690.00540: done queuing things up, now waiting for results queue to drain 49116 1727204690.00541: waiting for pending results... 
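This third task (tasks/set_facts.yml:21) does execute: it invokes the service_facts module with no arguments (the invocation echoed later in the output confirms empty module_args) and populates ansible_facts.services with one entry per systemd unit, which is the large JSON blob returned further below. A minimal sketch of the task, plus a purely hypothetical follow-up task that reads the gathered facts:

- name: Check which services are running
  service_facts:

- name: Show NetworkManager state   # illustrative only, not part of the role
  debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"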
49116 1727204690.00950: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 49116 1727204690.01018: in run() - task 127b8e07-fff9-02f7-957b-0000000004c4 49116 1727204690.01044: variable 'ansible_search_path' from source: unknown 49116 1727204690.01048: variable 'ansible_search_path' from source: unknown 49116 1727204690.01079: calling self._execute() 49116 1727204690.01177: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204690.01183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204690.01196: variable 'omit' from source: magic vars 49116 1727204690.01674: variable 'ansible_distribution_major_version' from source: facts 49116 1727204690.01677: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204690.01680: variable 'omit' from source: magic vars 49116 1727204690.01709: variable 'omit' from source: magic vars 49116 1727204690.01749: variable 'omit' from source: magic vars 49116 1727204690.01800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204690.01840: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204690.01860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204690.01883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204690.01896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204690.01935: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204690.01939: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204690.01941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204690.02071: Set connection var ansible_connection to ssh 49116 1727204690.02075: Set connection var ansible_timeout to 10 49116 1727204690.02078: Set connection var ansible_shell_executable to /bin/sh 49116 1727204690.02081: Set connection var ansible_pipelining to False 49116 1727204690.02084: Set connection var ansible_shell_type to sh 49116 1727204690.02129: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204690.02140: variable 'ansible_shell_executable' from source: unknown 49116 1727204690.02143: variable 'ansible_connection' from source: unknown 49116 1727204690.02148: variable 'ansible_module_compression' from source: unknown 49116 1727204690.02150: variable 'ansible_shell_type' from source: unknown 49116 1727204690.02153: variable 'ansible_shell_executable' from source: unknown 49116 1727204690.02155: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204690.02157: variable 'ansible_pipelining' from source: unknown 49116 1727204690.02160: variable 'ansible_timeout' from source: unknown 49116 1727204690.02162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204690.02460: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204690.02467: variable 'omit' from source: magic vars 49116 
1727204690.02478: starting attempt loop 49116 1727204690.02481: running the handler 49116 1727204690.02483: _low_level_execute_command(): starting 49116 1727204690.02485: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204690.03295: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204690.03349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204690.03420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204690.05319: stdout chunk (state=3): >>>/root <<< 49116 1727204690.05512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204690.05516: stdout chunk (state=3): >>><<< 49116 1727204690.05519: stderr chunk (state=3): >>><<< 49116 1727204690.05542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204690.05564: _low_level_execute_command(): starting 49116 1727204690.05584: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635 `" && echo ansible-tmp-1727204690.0554936-50097-47844087550635="` echo /root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635 `" ) && sleep 0' 49116 1727204690.06333: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204690.06393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204690.06487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204690.06500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204690.06512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204690.06595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204690.08742: stdout chunk (state=3): >>>ansible-tmp-1727204690.0554936-50097-47844087550635=/root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635 <<< 49116 1727204690.08846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204690.08912: stderr chunk (state=3): >>><<< 49116 1727204690.08915: stdout chunk (state=3): >>><<< 49116 1727204690.08968: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204690.0554936-50097-47844087550635=/root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204690.08997: variable 'ansible_module_compression' from source: unknown 49116 1727204690.09043: ANSIBALLZ: Using lock for service_facts 49116 1727204690.09046: ANSIBALLZ: Acquiring lock 49116 1727204690.09050: ANSIBALLZ: Lock acquired: 139720114793712 49116 1727204690.09053: ANSIBALLZ: Creating module 49116 1727204690.20975: ANSIBALLZ: Writing module into payload 49116 
1727204690.21052: ANSIBALLZ: Writing module 49116 1727204690.21075: ANSIBALLZ: Renaming module 49116 1727204690.21082: ANSIBALLZ: Done creating module 49116 1727204690.21099: variable 'ansible_facts' from source: unknown 49116 1727204690.21149: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/AnsiballZ_service_facts.py 49116 1727204690.21273: Sending initial data 49116 1727204690.21277: Sent initial data (161 bytes) 49116 1727204690.21770: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204690.21779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204690.21798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204690.21801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204690.21864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204690.21879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204690.21882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204690.21956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204690.23742: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204690.23818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204690.23884: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpq4yc_ej_ /root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/AnsiballZ_service_facts.py <<< 49116 1727204690.23888: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/AnsiballZ_service_facts.py" <<< 49116 1727204690.23945: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 49116 1727204690.23950: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpq4yc_ej_" to remote "/root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/AnsiballZ_service_facts.py" <<< 49116 1727204690.24642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204690.24722: stderr chunk (state=3): >>><<< 49116 1727204690.24726: stdout chunk (state=3): >>><<< 49116 1727204690.24745: done transferring module to remote 49116 1727204690.24756: _low_level_execute_command(): starting 49116 1727204690.24761: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/ /root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/AnsiballZ_service_facts.py && sleep 0' 49116 1727204690.25243: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204690.25281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204690.25285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204690.25287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204690.25290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204690.25340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204690.25344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204690.25425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204690.27430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204690.27493: stderr chunk (state=3): >>><<< 49116 1727204690.27497: stdout chunk (state=3): >>><<< 49116 1727204690.27512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204690.27515: _low_level_execute_command(): starting 49116 1727204690.27520: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/AnsiballZ_service_facts.py && sleep 0' 49116 1727204690.28035: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204690.28040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204690.28043: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204690.28047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204690.28109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204690.28117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204690.28119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204690.28198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204692.75242: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": 
"mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": 
"systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-<<< 49116 1727204692.75363: stdout chunk (state=3): >>>utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 49116 1727204692.77018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204692.77513: stderr chunk (state=3): >>><<< 49116 1727204692.77518: stdout chunk (state=3): >>><<< 49116 1727204692.77676: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": 
"disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": 
{"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204692.78545: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204692.78569: _low_level_execute_command(): starting 49116 1727204692.78587: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204690.0554936-50097-47844087550635/ > /dev/null 2>&1 && sleep 0' 49116 1727204692.79362: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204692.79457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204692.79505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204692.79587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204692.81774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204692.81778: stdout chunk (state=3): >>><<< 49116 1727204692.81781: stderr chunk (state=3): >>><<< 49116 1727204692.81799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204692.81993: handler run complete 49116 1727204692.82086: variable 'ansible_facts' from source: unknown 49116 1727204692.82282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204692.82836: variable 'ansible_facts' from source: unknown 49116 1727204692.82997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204692.83267: attempt loop complete, returning result 49116 1727204692.83282: _execute() done 49116 1727204692.83291: dumping result to json 49116 1727204692.83370: done dumping result, returning 49116 1727204692.83386: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-02f7-957b-0000000004c4] 49116 1727204692.83397: sending task result for task 127b8e07-fff9-02f7-957b-0000000004c4 49116 1727204692.84990: done sending task result for task 127b8e07-fff9-02f7-957b-0000000004c4 49116 1727204692.84994: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204692.85090: no more pending results, returning what we have 49116 1727204692.85093: results queue empty 49116 1727204692.85095: checking for any_errors_fatal 49116 1727204692.85098: done checking for any_errors_fatal 49116 1727204692.85099: checking for max_fail_percentage 49116 1727204692.85100: done checking for max_fail_percentage 49116 1727204692.85101: checking to see if all hosts have failed and the running result is not ok 49116 1727204692.85102: done checking to see if all hosts have failed 49116 1727204692.85103: getting the remaining hosts for this loop 49116 1727204692.85105: done getting the remaining hosts for this loop 49116 1727204692.85109: getting the next task for host managed-node3 49116 1727204692.85114: done getting next task for host managed-node3 49116 1727204692.85118: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 49116 1727204692.85122: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204692.85131: getting variables 49116 1727204692.85132: in VariableManager get_vars() 49116 1727204692.85172: Calling all_inventory to load vars for managed-node3 49116 1727204692.85175: Calling groups_inventory to load vars for managed-node3 49116 1727204692.85178: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204692.85188: Calling all_plugins_play to load vars for managed-node3 49116 1727204692.85191: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204692.85194: Calling groups_plugins_play to load vars for managed-node3 49116 1727204692.85678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204692.86352: done with get_vars() 49116 1727204692.86379: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:52 -0400 (0:00:02.863) 0:00:15.889 ***** 49116 1727204692.86494: entering _queue_task() for managed-node3/package_facts 49116 1727204692.86496: Creating lock for package_facts 49116 1727204692.87073: worker is 1 (out of 1 available) 49116 1727204692.87088: exiting _queue_task() for managed-node3/package_facts 49116 1727204692.87100: done queuing things up, now waiting for results queue to drain 49116 1727204692.87101: waiting for pending results... 49116 1727204692.87233: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 49116 1727204692.87439: in run() - task 127b8e07-fff9-02f7-957b-0000000004c5 49116 1727204692.87443: variable 'ansible_search_path' from source: unknown 49116 1727204692.87446: variable 'ansible_search_path' from source: unknown 49116 1727204692.87484: calling self._execute() 49116 1727204692.87592: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204692.87618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204692.87628: variable 'omit' from source: magic vars 49116 1727204692.88075: variable 'ansible_distribution_major_version' from source: facts 49116 1727204692.88080: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204692.88094: variable 'omit' from source: magic vars 49116 1727204692.88201: variable 'omit' from source: magic vars 49116 1727204692.88271: variable 'omit' from source: magic vars 49116 1727204692.88314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204692.88363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204692.88418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204692.88428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204692.88492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204692.88496: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204692.88498: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204692.88505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204692.88635: Set 
connection var ansible_connection to ssh 49116 1727204692.88656: Set connection var ansible_timeout to 10 49116 1727204692.88671: Set connection var ansible_shell_executable to /bin/sh 49116 1727204692.88682: Set connection var ansible_pipelining to False 49116 1727204692.88690: Set connection var ansible_shell_type to sh 49116 1727204692.88701: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204692.88744: variable 'ansible_shell_executable' from source: unknown 49116 1727204692.88750: variable 'ansible_connection' from source: unknown 49116 1727204692.88760: variable 'ansible_module_compression' from source: unknown 49116 1727204692.88769: variable 'ansible_shell_type' from source: unknown 49116 1727204692.88777: variable 'ansible_shell_executable' from source: unknown 49116 1727204692.88784: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204692.88791: variable 'ansible_pipelining' from source: unknown 49116 1727204692.88797: variable 'ansible_timeout' from source: unknown 49116 1727204692.88818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204692.89063: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204692.89148: variable 'omit' from source: magic vars 49116 1727204692.89151: starting attempt loop 49116 1727204692.89154: running the handler 49116 1727204692.89157: _low_level_execute_command(): starting 49116 1727204692.89159: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204692.89999: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204692.90127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204692.90139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204692.90347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204692.90365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204692.92177: stdout chunk (state=3): >>>/root <<< 49116 1727204692.92391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204692.92396: stdout chunk (state=3): >>><<< 49116 1727204692.92398: stderr chunk (state=3): >>><<< 49116 1727204692.92422: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204692.92471: _low_level_execute_command(): starting 49116 1727204692.92475: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429 `" && echo ansible-tmp-1727204692.9243128-50168-36993261138429="` echo /root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429 `" ) && sleep 0' 49116 1727204692.93289: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204692.93354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204692.93438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204692.95667: stdout chunk (state=3): >>>ansible-tmp-1727204692.9243128-50168-36993261138429=/root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429 <<< 49116 1727204692.95888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204692.95892: stdout chunk (state=3): >>><<< 49116 1727204692.95895: stderr chunk (state=3): >>><<< 49116 1727204692.95911: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204692.9243128-50168-36993261138429=/root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204692.95984: variable 'ansible_module_compression' from source: unknown 49116 1727204692.96067: ANSIBALLZ: Using lock for package_facts 49116 1727204692.96072: ANSIBALLZ: Acquiring lock 49116 1727204692.96075: ANSIBALLZ: Lock acquired: 139720115975904 49116 1727204692.96077: ANSIBALLZ: Creating module 49116 1727204693.32391: ANSIBALLZ: Writing module into payload 49116 1727204693.32590: ANSIBALLZ: Writing module 49116 1727204693.32595: ANSIBALLZ: Renaming module 49116 1727204693.32598: ANSIBALLZ: Done creating module 49116 1727204693.32693: variable 'ansible_facts' from source: unknown 49116 1727204693.32820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/AnsiballZ_package_facts.py 49116 1727204693.33199: Sending initial data 49116 1727204693.33203: Sent initial data (161 bytes) 49116 1727204693.33773: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204693.33777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204693.33781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204693.33784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204693.33787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204693.33789: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204693.33791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204693.33794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204693.33797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204693.33799: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49116 1727204693.33801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204693.33803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204693.33816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204693.33819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204693.33822: stderr chunk (state=3): 
>>>debug2: match found <<< 49116 1727204693.33831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204693.33907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204693.33920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204693.33942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204693.34050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204693.35859: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204693.35919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204693.35995: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpe9xythg7 /root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/AnsiballZ_package_facts.py <<< 49116 1727204693.35999: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/AnsiballZ_package_facts.py" <<< 49116 1727204693.36058: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpe9xythg7" to remote "/root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/AnsiballZ_package_facts.py" <<< 49116 1727204693.36066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/AnsiballZ_package_facts.py" <<< 49116 1727204693.37911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204693.38014: stderr chunk (state=3): >>><<< 49116 1727204693.38019: stdout chunk (state=3): >>><<< 49116 1727204693.38132: done transferring module to remote 49116 1727204693.38136: _low_level_execute_command(): starting 49116 1727204693.38139: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/ /root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/AnsiballZ_package_facts.py && sleep 0' 49116 1727204693.38762: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204693.38825: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204693.38829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 49116 1727204693.38832: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204693.38839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204693.38841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204693.38904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204693.38920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204693.39007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204693.41173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204693.41178: stdout chunk (state=3): >>><<< 49116 1727204693.41181: stderr chunk (state=3): >>><<< 49116 1727204693.41183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204693.41186: _low_level_execute_command(): starting 49116 1727204693.41188: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/AnsiballZ_package_facts.py && sleep 0' 49116 1727204693.41916: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204693.41922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204693.41948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 49116 1727204693.41962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204693.42017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204693.42021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204693.42025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204693.42104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204694.07432: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", 
"version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 49116 1727204694.07533: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": 
[{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 49116 1727204694.07548: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": 
"iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": 
"gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", 
"version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": 
"4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": 
"1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": 
"libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": 
[{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarc<<< 49116 1727204694.07642: stdout chunk (state=3): >>>h", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": 
"506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.32", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.32", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 49116 1727204694.09752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204694.09887: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<< 49116 1727204694.10114: stderr chunk (state=3): >>><<< 49116 1727204694.10119: stdout chunk (state=3): >>><<< 49116 1727204694.10172: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": 
"22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": 
[{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": 
[{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": 
"exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", 
"release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": 
[{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": 
[{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", 
"release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": 
"keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.32", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.32", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204694.17218: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204694.17348: _low_level_execute_command(): starting 49116 1727204694.17352: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204692.9243128-50168-36993261138429/ > /dev/null 2>&1 && sleep 0' 49116 1727204694.18179: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204694.18183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204694.18186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204694.18189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204694.18227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204694.18232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204694.18235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204694.18416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204694.20675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204694.20680: stdout chunk (state=3): >>><<< 49116 1727204694.20683: stderr chunk (state=3): >>><<< 49116 1727204694.20687: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204694.20690: handler run complete 49116 1727204694.22004: variable 'ansible_facts' from source: unknown 49116 1727204694.22718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204694.32700: variable 'ansible_facts' from source: unknown 49116 1727204694.33237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204694.34466: attempt loop complete, returning result 49116 1727204694.34504: _execute() done 49116 1727204694.34507: dumping result to json 49116 1727204694.34982: done dumping result, returning 49116 1727204694.34993: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-02f7-957b-0000000004c5] 49116 1727204694.34996: sending task result for task 127b8e07-fff9-02f7-957b-0000000004c5 49116 1727204694.39973: done sending task result for task 127b8e07-fff9-02f7-957b-0000000004c5 49116 1727204694.39983: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204694.40044: no more pending results, returning what we have 49116 1727204694.40047: results queue empty 49116 1727204694.40048: checking for any_errors_fatal 49116 1727204694.40054: done checking for any_errors_fatal 49116 1727204694.40055: checking for max_fail_percentage 49116 1727204694.40056: done checking for max_fail_percentage 49116 1727204694.40057: checking to see if all hosts have failed and the running result is not ok 49116 1727204694.40058: done checking to see if all hosts have failed 49116 1727204694.40059: getting the remaining hosts for this loop 49116 1727204694.40060: done getting the remaining hosts for this loop 49116 1727204694.40065: getting the next task for host managed-node3 49116 1727204694.40074: done getting next task for host managed-node3 49116 1727204694.40077: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 49116 1727204694.40081: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204694.40091: getting variables 49116 1727204694.40112: in VariableManager get_vars() 49116 1727204694.40153: Calling all_inventory to load vars for managed-node3 49116 1727204694.40156: Calling groups_inventory to load vars for managed-node3 49116 1727204694.40158: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204694.40208: Calling all_plugins_play to load vars for managed-node3 49116 1727204694.40212: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204694.40216: Calling groups_plugins_play to load vars for managed-node3 49116 1727204694.42770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204694.45845: done with get_vars() 49116 1727204694.45885: done getting variables 49116 1727204694.45960: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:54 -0400 (0:00:01.595) 0:00:17.484 ***** 49116 1727204694.46005: entering _queue_task() for managed-node3/debug 49116 1727204694.46695: worker is 1 (out of 1 available) 49116 1727204694.46782: exiting _queue_task() for managed-node3/debug 49116 1727204694.46794: done queuing things up, now waiting for results queue to drain 49116 1727204694.46796: waiting for pending results... 49116 1727204694.47274: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 49116 1727204694.47616: in run() - task 127b8e07-fff9-02f7-957b-000000000017 49116 1727204694.48073: variable 'ansible_search_path' from source: unknown 49116 1727204694.48077: variable 'ansible_search_path' from source: unknown 49116 1727204694.48080: calling self._execute() 49116 1727204694.48082: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204694.48085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204694.48088: variable 'omit' from source: magic vars 49116 1727204694.48895: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.48920: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204694.48934: variable 'omit' from source: magic vars 49116 1727204694.49009: variable 'omit' from source: magic vars 49116 1727204694.49303: variable 'network_provider' from source: set_fact 49116 1727204694.49672: variable 'omit' from source: magic vars 49116 1727204694.49677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204694.49680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204694.49683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204694.49685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204694.49687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 
1727204694.50072: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204694.50079: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204694.50085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204694.50088: Set connection var ansible_connection to ssh 49116 1727204694.50091: Set connection var ansible_timeout to 10 49116 1727204694.50093: Set connection var ansible_shell_executable to /bin/sh 49116 1727204694.50095: Set connection var ansible_pipelining to False 49116 1727204694.50097: Set connection var ansible_shell_type to sh 49116 1727204694.50099: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204694.50277: variable 'ansible_shell_executable' from source: unknown 49116 1727204694.50282: variable 'ansible_connection' from source: unknown 49116 1727204694.50285: variable 'ansible_module_compression' from source: unknown 49116 1727204694.50287: variable 'ansible_shell_type' from source: unknown 49116 1727204694.50290: variable 'ansible_shell_executable' from source: unknown 49116 1727204694.50292: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204694.50296: variable 'ansible_pipelining' from source: unknown 49116 1727204694.50298: variable 'ansible_timeout' from source: unknown 49116 1727204694.50301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204694.50463: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204694.50873: variable 'omit' from source: magic vars 49116 1727204694.50876: starting attempt loop 49116 1727204694.50880: running the handler 49116 1727204694.50882: handler run complete 49116 1727204694.50884: attempt loop complete, returning result 49116 1727204694.50886: _execute() done 49116 1727204694.50888: dumping result to json 49116 1727204694.50890: done dumping result, returning 49116 1727204694.50892: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-02f7-957b-000000000017] 49116 1727204694.50894: sending task result for task 127b8e07-fff9-02f7-957b-000000000017 49116 1727204694.50972: done sending task result for task 127b8e07-fff9-02f7-957b-000000000017 49116 1727204694.50977: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 49116 1727204694.51055: no more pending results, returning what we have 49116 1727204694.51059: results queue empty 49116 1727204694.51060: checking for any_errors_fatal 49116 1727204694.51219: done checking for any_errors_fatal 49116 1727204694.51220: checking for max_fail_percentage 49116 1727204694.51223: done checking for max_fail_percentage 49116 1727204694.51224: checking to see if all hosts have failed and the running result is not ok 49116 1727204694.51225: done checking to see if all hosts have failed 49116 1727204694.51226: getting the remaining hosts for this loop 49116 1727204694.51227: done getting the remaining hosts for this loop 49116 1727204694.51232: getting the next task for host managed-node3 49116 1727204694.51242: done getting next task for host managed-node3 49116 1727204694.51245: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 49116 1727204694.51249: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204694.51262: getting variables 49116 1727204694.51264: in VariableManager get_vars() 49116 1727204694.51311: Calling all_inventory to load vars for managed-node3 49116 1727204694.51370: Calling groups_inventory to load vars for managed-node3 49116 1727204694.51374: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204694.51386: Calling all_plugins_play to load vars for managed-node3 49116 1727204694.51389: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204694.51392: Calling groups_plugins_play to load vars for managed-node3 49116 1727204694.55522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204694.58295: done with get_vars() 49116 1727204694.58458: done getting variables 49116 1727204694.58600: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.126) 0:00:17.611 ***** 49116 1727204694.58755: entering _queue_task() for managed-node3/fail 49116 1727204694.59321: worker is 1 (out of 1 available) 49116 1727204694.59335: exiting _queue_task() for managed-node3/fail 49116 1727204694.59354: done queuing things up, now waiting for results queue to drain 49116 1727204694.59356: waiting for pending results... 
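For reference, the task at roles/network/tasks/main.yml:7 that produced the "Using network provider: nm" result above likely resembles the sketch below. The task name, module, and rendered message come straight from the trace; the exact msg template and the placement of the when condition are assumptions (the condition may be inherited from an enclosing block rather than set on the task itself).

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # trace shows the rendered result "Using network provider: nm"
      when: ansible_distribution_major_version != '6'           # condition evaluated in the trace; possibly inherited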
49116 1727204694.59580: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 49116 1727204694.59731: in run() - task 127b8e07-fff9-02f7-957b-000000000018 49116 1727204694.59758: variable 'ansible_search_path' from source: unknown 49116 1727204694.59768: variable 'ansible_search_path' from source: unknown 49116 1727204694.59814: calling self._execute() 49116 1727204694.59912: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204694.59925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204694.59943: variable 'omit' from source: magic vars 49116 1727204694.60335: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.60353: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204694.60489: variable 'network_state' from source: role '' defaults 49116 1727204694.60506: Evaluated conditional (network_state != {}): False 49116 1727204694.60514: when evaluation is False, skipping this task 49116 1727204694.60522: _execute() done 49116 1727204694.60671: dumping result to json 49116 1727204694.60675: done dumping result, returning 49116 1727204694.60678: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-02f7-957b-000000000018] 49116 1727204694.60681: sending task result for task 127b8e07-fff9-02f7-957b-000000000018 49116 1727204694.60767: done sending task result for task 127b8e07-fff9-02f7-957b-000000000018 49116 1727204694.60772: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204694.60834: no more pending results, returning what we have 49116 1727204694.60839: results queue empty 49116 1727204694.60840: checking for any_errors_fatal 49116 1727204694.60849: done checking for any_errors_fatal 49116 1727204694.60849: checking for max_fail_percentage 49116 1727204694.60852: done checking for max_fail_percentage 49116 1727204694.60853: checking to see if all hosts have failed and the running result is not ok 49116 1727204694.60854: done checking to see if all hosts have failed 49116 1727204694.60855: getting the remaining hosts for this loop 49116 1727204694.60856: done getting the remaining hosts for this loop 49116 1727204694.60861: getting the next task for host managed-node3 49116 1727204694.60869: done getting next task for host managed-node3 49116 1727204694.60874: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 49116 1727204694.60937: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204694.60955: getting variables 49116 1727204694.60956: in VariableManager get_vars() 49116 1727204694.61039: Calling all_inventory to load vars for managed-node3 49116 1727204694.61042: Calling groups_inventory to load vars for managed-node3 49116 1727204694.61045: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204694.61055: Calling all_plugins_play to load vars for managed-node3 49116 1727204694.61058: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204694.61061: Calling groups_plugins_play to load vars for managed-node3 49116 1727204694.63941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204694.66714: done with get_vars() 49116 1727204694.66758: done getting variables 49116 1727204694.66830: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.082) 0:00:17.693 ***** 49116 1727204694.66883: entering _queue_task() for managed-node3/fail 49116 1727204694.67409: worker is 1 (out of 1 available) 49116 1727204694.67422: exiting _queue_task() for managed-node3/fail 49116 1727204694.67437: done queuing things up, now waiting for results queue to drain 49116 1727204694.67439: waiting for pending results... 
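The task skipped just above (main.yml:11) is a fail task gated on network_state; a minimal sketch consistent with the trace follows. The abort message is a placeholder, and the role may list further conditions (for example a provider check) that never show up in false_condition, because a when list short-circuits at the first condition that evaluates to false.

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider   # placeholder wording, not taken from the log
      when: network_state != {}   # the only condition the trace evaluates before skipping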
49116 1727204694.67831: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 49116 1727204694.68172: in run() - task 127b8e07-fff9-02f7-957b-000000000019 49116 1727204694.68176: variable 'ansible_search_path' from source: unknown 49116 1727204694.68179: variable 'ansible_search_path' from source: unknown 49116 1727204694.68316: calling self._execute() 49116 1727204694.68587: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204694.68592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204694.68595: variable 'omit' from source: magic vars 49116 1727204694.69268: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.69304: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204694.69585: variable 'network_state' from source: role '' defaults 49116 1727204694.69617: Evaluated conditional (network_state != {}): False 49116 1727204694.69621: when evaluation is False, skipping this task 49116 1727204694.69624: _execute() done 49116 1727204694.69628: dumping result to json 49116 1727204694.69630: done dumping result, returning 49116 1727204694.69640: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-02f7-957b-000000000019] 49116 1727204694.69771: sending task result for task 127b8e07-fff9-02f7-957b-000000000019 49116 1727204694.69849: done sending task result for task 127b8e07-fff9-02f7-957b-000000000019 49116 1727204694.69853: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204694.69915: no more pending results, returning what we have 49116 1727204694.69919: results queue empty 49116 1727204694.69920: checking for any_errors_fatal 49116 1727204694.69927: done checking for any_errors_fatal 49116 1727204694.69928: checking for max_fail_percentage 49116 1727204694.69930: done checking for max_fail_percentage 49116 1727204694.69931: checking to see if all hosts have failed and the running result is not ok 49116 1727204694.69932: done checking to see if all hosts have failed 49116 1727204694.69935: getting the remaining hosts for this loop 49116 1727204694.69936: done getting the remaining hosts for this loop 49116 1727204694.69941: getting the next task for host managed-node3 49116 1727204694.69947: done getting next task for host managed-node3 49116 1727204694.69951: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 49116 1727204694.69955: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204694.70080: getting variables 49116 1727204694.70082: in VariableManager get_vars() 49116 1727204694.70126: Calling all_inventory to load vars for managed-node3 49116 1727204694.70129: Calling groups_inventory to load vars for managed-node3 49116 1727204694.70131: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204694.70144: Calling all_plugins_play to load vars for managed-node3 49116 1727204694.70147: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204694.70150: Calling groups_plugins_play to load vars for managed-node3 49116 1727204694.72102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204694.74386: done with get_vars() 49116 1727204694.74423: done getting variables 49116 1727204694.74481: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.076) 0:00:17.770 ***** 49116 1727204694.74512: entering _queue_task() for managed-node3/fail 49116 1727204694.74794: worker is 1 (out of 1 available) 49116 1727204694.74809: exiting _queue_task() for managed-node3/fail 49116 1727204694.74823: done queuing things up, now waiting for results queue to drain 49116 1727204694.74825: waiting for pending results... 
49116 1727204694.75025: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 49116 1727204694.75126: in run() - task 127b8e07-fff9-02f7-957b-00000000001a 49116 1727204694.75142: variable 'ansible_search_path' from source: unknown 49116 1727204694.75145: variable 'ansible_search_path' from source: unknown 49116 1727204694.75183: calling self._execute() 49116 1727204694.75285: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204694.75290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204694.75294: variable 'omit' from source: magic vars 49116 1727204694.75672: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.75676: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204694.75813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204694.77767: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204694.77820: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204694.77852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204694.77883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204694.77904: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204694.77978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204694.78003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204694.78022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204694.78054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204694.78070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204694.78152: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.78167: Evaluated conditional (ansible_distribution_major_version | int > 9): True 49116 1727204694.78258: variable 'ansible_distribution' from source: facts 49116 1727204694.78262: variable '__network_rh_distros' from source: role '' defaults 49116 1727204694.78272: Evaluated conditional (ansible_distribution in __network_rh_distros): False 49116 1727204694.78275: when evaluation is False, skipping this task 49116 1727204694.78278: _execute() done 49116 1727204694.78281: dumping result to json 49116 1727204694.78284: done dumping result, returning 49116 1727204694.78294: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-02f7-957b-00000000001a] 49116 1727204694.78297: sending task result for task 127b8e07-fff9-02f7-957b-00000000001a 49116 1727204694.78393: done sending task result for task 127b8e07-fff9-02f7-957b-00000000001a 49116 1727204694.78396: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 49116 1727204694.78457: no more pending results, returning what we have 49116 1727204694.78460: results queue empty 49116 1727204694.78462: checking for any_errors_fatal 49116 1727204694.78476: done checking for any_errors_fatal 49116 1727204694.78477: checking for max_fail_percentage 49116 1727204694.78479: done checking for max_fail_percentage 49116 1727204694.78480: checking to see if all hosts have failed and the running result is not ok 49116 1727204694.78481: done checking to see if all hosts have failed 49116 1727204694.78482: getting the remaining hosts for this loop 49116 1727204694.78483: done getting the remaining hosts for this loop 49116 1727204694.78488: getting the next task for host managed-node3 49116 1727204694.78495: done getting next task for host managed-node3 49116 1727204694.78499: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 49116 1727204694.78502: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204694.78517: getting variables 49116 1727204694.78519: in VariableManager get_vars() 49116 1727204694.78562: Calling all_inventory to load vars for managed-node3 49116 1727204694.78566: Calling groups_inventory to load vars for managed-node3 49116 1727204694.78569: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204694.78585: Calling all_plugins_play to load vars for managed-node3 49116 1727204694.78588: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204694.78591: Calling groups_plugins_play to load vars for managed-node3 49116 1727204694.79721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204694.80904: done with get_vars() 49116 1727204694.80931: done getting variables 49116 1727204694.81020: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.065) 0:00:17.835 ***** 49116 1727204694.81047: entering _queue_task() for managed-node3/dnf 49116 1727204694.81323: worker is 1 (out of 1 available) 49116 1727204694.81337: exiting _queue_task() for managed-node3/dnf 49116 1727204694.81359: done queuing things up, now waiting for results queue to drain 49116 1727204694.81361: waiting for pending results... 
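The teaming check at main.yml:25, skipped just above, evaluates two conditions in the trace: ansible_distribution_major_version | int > 9 (True) and ansible_distribution in __network_rh_distros (False), so it does nothing on a non-RH distribution. A hedged reconstruction, with a placeholder message:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later   # placeholder wording, not taken from the log
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros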
49116 1727204694.81554: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 49116 1727204694.81684: in run() - task 127b8e07-fff9-02f7-957b-00000000001b 49116 1727204694.81699: variable 'ansible_search_path' from source: unknown 49116 1727204694.81702: variable 'ansible_search_path' from source: unknown 49116 1727204694.81739: calling self._execute() 49116 1727204694.81807: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204694.81819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204694.81822: variable 'omit' from source: magic vars 49116 1727204694.82149: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.82157: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204694.82315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204694.84580: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204694.84589: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204694.84627: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204694.84663: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204694.84691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204694.84778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204694.84806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204694.84830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204694.84873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204694.84887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204694.85013: variable 'ansible_distribution' from source: facts 49116 1727204694.85017: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.85020: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 49116 1727204694.85143: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204694.85280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204694.85303: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204694.85326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204694.85373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204694.85386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204694.85418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204694.85435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204694.85460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204694.85489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204694.85500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204694.85531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204694.85555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204694.85585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204694.85612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204694.85623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204694.85746: variable 'network_connections' from source: task vars 49116 1727204694.85758: variable 'interface' from source: play vars 49116 1727204694.85818: variable 'interface' from source: play vars 49116 1727204694.85828: variable 'vlan_interface' from source: play vars 49116 1727204694.85880: variable 'vlan_interface' from source: play vars 49116 1727204694.85883: variable 'interface' from source: play vars 49116 
1727204694.85930: variable 'interface' from source: play vars 49116 1727204694.85990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204694.86124: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204694.86157: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204694.86195: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204694.86221: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204694.86258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204694.86280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204694.86298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204694.86317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204694.86373: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204694.86548: variable 'network_connections' from source: task vars 49116 1727204694.86553: variable 'interface' from source: play vars 49116 1727204694.86600: variable 'interface' from source: play vars 49116 1727204694.86608: variable 'vlan_interface' from source: play vars 49116 1727204694.86656: variable 'vlan_interface' from source: play vars 49116 1727204694.86659: variable 'interface' from source: play vars 49116 1727204694.86708: variable 'interface' from source: play vars 49116 1727204694.86734: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49116 1727204694.86740: when evaluation is False, skipping this task 49116 1727204694.86744: _execute() done 49116 1727204694.86746: dumping result to json 49116 1727204694.86749: done dumping result, returning 49116 1727204694.86759: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-02f7-957b-00000000001b] 49116 1727204694.86762: sending task result for task 127b8e07-fff9-02f7-957b-00000000001b 49116 1727204694.86863: done sending task result for task 127b8e07-fff9-02f7-957b-00000000001b 49116 1727204694.86868: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49116 1727204694.86930: no more pending results, returning what we have 49116 1727204694.86933: results queue empty 49116 1727204694.86935: checking for any_errors_fatal 49116 1727204694.86942: done checking for any_errors_fatal 49116 1727204694.86943: checking for max_fail_percentage 49116 1727204694.86944: done checking for max_fail_percentage 49116 
1727204694.86945: checking to see if all hosts have failed and the running result is not ok 49116 1727204694.86946: done checking to see if all hosts have failed 49116 1727204694.86947: getting the remaining hosts for this loop 49116 1727204694.86948: done getting the remaining hosts for this loop 49116 1727204694.86953: getting the next task for host managed-node3 49116 1727204694.86959: done getting next task for host managed-node3 49116 1727204694.86963: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 49116 1727204694.86969: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204694.86985: getting variables 49116 1727204694.86987: in VariableManager get_vars() 49116 1727204694.87030: Calling all_inventory to load vars for managed-node3 49116 1727204694.87033: Calling groups_inventory to load vars for managed-node3 49116 1727204694.87035: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204694.87045: Calling all_plugins_play to load vars for managed-node3 49116 1727204694.87048: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204694.87050: Calling groups_plugins_play to load vars for managed-node3 49116 1727204694.88081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204694.90048: done with get_vars() 49116 1727204694.90087: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 49116 1727204694.90181: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:54 -0400 (0:00:00.091) 0:00:17.927 ***** 49116 1727204694.90217: entering _queue_task() for managed-node3/yum 49116 1727204694.90219: Creating lock for yum 49116 1727204694.90605: worker is 1 (out of 1 available) 49116 1727204694.90620: exiting _queue_task() for managed-node3/yum 49116 1727204694.90634: done queuing things up, now waiting for results queue to drain 49116 1727204694.90635: waiting for pending results... 
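The DNF check at main.yml:36, skipped above, loads the dnf action plugin but never runs the module, so its arguments are not visible in this log; only the two gating conditions are. The sketch below therefore uses placeholder module arguments and keeps the conditions exactly as the trace evaluates them.

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: NetworkManager   # placeholder; the real package list is not shown in this log
        state: latest          # placeholder; the real arguments are not shown in this log
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined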
49116 1727204694.91091: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 49116 1727204694.91135: in run() - task 127b8e07-fff9-02f7-957b-00000000001c 49116 1727204694.91161: variable 'ansible_search_path' from source: unknown 49116 1727204694.91177: variable 'ansible_search_path' from source: unknown 49116 1727204694.91231: calling self._execute() 49116 1727204694.91343: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204694.91359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204694.91379: variable 'omit' from source: magic vars 49116 1727204694.91867: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.91873: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204694.92048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204694.98421: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204694.98469: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204694.98497: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204694.98526: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204694.98555: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204694.98616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204694.98639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204694.98657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204694.98687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204694.98699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204694.98775: variable 'ansible_distribution_major_version' from source: facts 49116 1727204694.98788: Evaluated conditional (ansible_distribution_major_version | int < 8): False 49116 1727204694.98791: when evaluation is False, skipping this task 49116 1727204694.98794: _execute() done 49116 1727204694.98796: dumping result to json 49116 1727204694.98799: done dumping result, returning 49116 1727204694.98807: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-02f7-957b-00000000001c] 49116 
1727204694.98810: sending task result for task 127b8e07-fff9-02f7-957b-00000000001c 49116 1727204694.98911: done sending task result for task 127b8e07-fff9-02f7-957b-00000000001c 49116 1727204694.98914: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 49116 1727204694.98986: no more pending results, returning what we have 49116 1727204694.98990: results queue empty 49116 1727204694.98991: checking for any_errors_fatal 49116 1727204694.98997: done checking for any_errors_fatal 49116 1727204694.98998: checking for max_fail_percentage 49116 1727204694.99000: done checking for max_fail_percentage 49116 1727204694.99001: checking to see if all hosts have failed and the running result is not ok 49116 1727204694.99002: done checking to see if all hosts have failed 49116 1727204694.99002: getting the remaining hosts for this loop 49116 1727204694.99004: done getting the remaining hosts for this loop 49116 1727204694.99008: getting the next task for host managed-node3 49116 1727204694.99015: done getting next task for host managed-node3 49116 1727204694.99019: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 49116 1727204694.99023: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204694.99041: getting variables 49116 1727204694.99042: in VariableManager get_vars() 49116 1727204694.99087: Calling all_inventory to load vars for managed-node3 49116 1727204694.99090: Calling groups_inventory to load vars for managed-node3 49116 1727204694.99092: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204694.99102: Calling all_plugins_play to load vars for managed-node3 49116 1727204694.99105: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204694.99108: Calling groups_plugins_play to load vars for managed-node3 49116 1727204695.04098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204695.05572: done with get_vars() 49116 1727204695.05600: done getting variables 49116 1727204695.05646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.154) 0:00:18.081 ***** 49116 1727204695.05676: entering _queue_task() for managed-node3/fail 49116 1727204695.05968: worker is 1 (out of 1 available) 49116 1727204695.05984: exiting _queue_task() for managed-node3/fail 49116 1727204695.05997: done queuing things up, now waiting for results queue to drain 49116 1727204695.05999: waiting for pending results... 
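The companion YUM check at main.yml:48, skipped above, only applies to hosts older than EL8; here ansible_distribution_major_version | int < 8 is False. Note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line in the trace: yum tasks are served by the dnf action plugin on this Ansible version. A minimal sketch with placeholder arguments:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: NetworkManager   # placeholder; the real arguments are not shown in this log
        state: latest          # placeholder
      when: ansible_distribution_major_version | int < 8   # the condition the trace evaluates before skipping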
49116 1727204695.06196: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 49116 1727204695.06299: in run() - task 127b8e07-fff9-02f7-957b-00000000001d 49116 1727204695.06311: variable 'ansible_search_path' from source: unknown 49116 1727204695.06316: variable 'ansible_search_path' from source: unknown 49116 1727204695.06353: calling self._execute() 49116 1727204695.06559: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204695.06563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204695.06573: variable 'omit' from source: magic vars 49116 1727204695.07074: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.07078: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204695.07193: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204695.07416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204695.10074: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204695.10280: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204695.10333: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204695.10381: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204695.10413: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204695.10510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.10551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.10596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.10644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.10870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.10874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.10876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.10878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.10880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.10882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.10884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.10890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.10918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.10962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.10982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.11163: variable 'network_connections' from source: task vars 49116 1727204695.11186: variable 'interface' from source: play vars 49116 1727204695.11275: variable 'interface' from source: play vars 49116 1727204695.11295: variable 'vlan_interface' from source: play vars 49116 1727204695.11370: variable 'vlan_interface' from source: play vars 49116 1727204695.11384: variable 'interface' from source: play vars 49116 1727204695.11452: variable 'interface' from source: play vars 49116 1727204695.11543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204695.11754: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204695.11801: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204695.11839: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204695.11875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204695.11928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204695.11955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204695.11987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.12019: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204695.12092: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204695.12346: variable 'network_connections' from source: task vars 49116 1727204695.12358: variable 'interface' from source: play vars 49116 1727204695.12432: variable 'interface' from source: play vars 49116 1727204695.12449: variable 'vlan_interface' from source: play vars 49116 1727204695.12516: variable 'vlan_interface' from source: play vars 49116 1727204695.12527: variable 'interface' from source: play vars 49116 1727204695.12589: variable 'interface' from source: play vars 49116 1727204695.12630: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49116 1727204695.12648: when evaluation is False, skipping this task 49116 1727204695.12655: _execute() done 49116 1727204695.12661: dumping result to json 49116 1727204695.12670: done dumping result, returning 49116 1727204695.12681: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-02f7-957b-00000000001d] 49116 1727204695.12690: sending task result for task 127b8e07-fff9-02f7-957b-00000000001d skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49116 1727204695.12997: no more pending results, returning what we have 49116 1727204695.13001: results queue empty 49116 1727204695.13002: checking for any_errors_fatal 49116 1727204695.13010: done checking for any_errors_fatal 49116 1727204695.13010: checking for max_fail_percentage 49116 1727204695.13012: done checking for max_fail_percentage 49116 1727204695.13013: checking to see if all hosts have failed and the running result is not ok 49116 1727204695.13014: done checking to see if all hosts have failed 49116 1727204695.13015: getting the remaining hosts for this loop 49116 1727204695.13016: done getting the remaining hosts for this loop 49116 1727204695.13021: getting the next task for host managed-node3 49116 1727204695.13027: done getting next task for host managed-node3 49116 1727204695.13031: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 49116 1727204695.13036: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204695.13050: done sending task result for task 127b8e07-fff9-02f7-957b-00000000001d 49116 1727204695.13054: WORKER PROCESS EXITING 49116 1727204695.13063: getting variables 49116 1727204695.13066: in VariableManager get_vars() 49116 1727204695.13106: Calling all_inventory to load vars for managed-node3 49116 1727204695.13109: Calling groups_inventory to load vars for managed-node3 49116 1727204695.13111: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204695.13122: Calling all_plugins_play to load vars for managed-node3 49116 1727204695.13124: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204695.13127: Calling groups_plugins_play to load vars for managed-node3 49116 1727204695.15344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204695.17759: done with get_vars() 49116 1727204695.17803: done getting variables 49116 1727204695.17883: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.122) 0:00:18.204 ***** 49116 1727204695.17922: entering _queue_task() for managed-node3/package 49116 1727204695.18411: worker is 1 (out of 1 available) 49116 1727204695.18424: exiting _queue_task() for managed-node3/package 49116 1727204695.18438: done queuing things up, now waiting for results queue to drain 49116 1727204695.18440: waiting for pending results... 
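The Install packages task queued above (main.yml:73) uses the generic package action plugin; the trace that follows resolves network_packages from the role defaults (__network_packages_default_nm, the wpa_supplicant handling, and so on). A hedged sketch, with the package state assumed rather than shown in the log:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # variable resolved in the trace below
        state: present                   # assumed; not shown in this log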
49116 1727204695.18679: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 49116 1727204695.18823: in run() - task 127b8e07-fff9-02f7-957b-00000000001e 49116 1727204695.18842: variable 'ansible_search_path' from source: unknown 49116 1727204695.18845: variable 'ansible_search_path' from source: unknown 49116 1727204695.18887: calling self._execute() 49116 1727204695.18999: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204695.19171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204695.19175: variable 'omit' from source: magic vars 49116 1727204695.19476: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.19488: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204695.19720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204695.20030: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204695.20084: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204695.20173: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204695.20210: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204695.20359: variable 'network_packages' from source: role '' defaults 49116 1727204695.20489: variable '__network_provider_setup' from source: role '' defaults 49116 1727204695.20501: variable '__network_service_name_default_nm' from source: role '' defaults 49116 1727204695.20582: variable '__network_service_name_default_nm' from source: role '' defaults 49116 1727204695.20592: variable '__network_packages_default_nm' from source: role '' defaults 49116 1727204695.20652: variable '__network_packages_default_nm' from source: role '' defaults 49116 1727204695.20859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204695.23340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204695.23413: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204695.23454: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204695.23496: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204695.23524: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204695.23645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.23675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.23701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.23754: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.23769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.23970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.23974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.23976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.23979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.23981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.24210: variable '__network_packages_default_gobject_packages' from source: role '' defaults 49116 1727204695.24348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.24381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.24410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.24455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.24473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.24587: variable 'ansible_python' from source: facts 49116 1727204695.24620: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 49116 1727204695.24725: variable '__network_wpa_supplicant_required' from source: role '' defaults 49116 1727204695.24822: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49116 1727204695.24981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.25007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 49116 1727204695.25043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.25090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.25104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.25167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.25191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.25217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.25270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.25285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.25570: variable 'network_connections' from source: task vars 49116 1727204695.25574: variable 'interface' from source: play vars 49116 1727204695.25584: variable 'interface' from source: play vars 49116 1727204695.25599: variable 'vlan_interface' from source: play vars 49116 1727204695.25713: variable 'vlan_interface' from source: play vars 49116 1727204695.25721: variable 'interface' from source: play vars 49116 1727204695.25834: variable 'interface' from source: play vars 49116 1727204695.25929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204695.25960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204695.26002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.26043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204695.26095: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204695.26446: variable 'network_connections' from source: task vars 49116 1727204695.26452: variable 'interface' from source: play vars 49116 1727204695.26570: variable 'interface' from source: play vars 49116 1727204695.26584: variable 
'vlan_interface' from source: play vars 49116 1727204695.26698: variable 'vlan_interface' from source: play vars 49116 1727204695.26707: variable 'interface' from source: play vars 49116 1727204695.26824: variable 'interface' from source: play vars 49116 1727204695.26900: variable '__network_packages_default_wireless' from source: role '' defaults 49116 1727204695.27170: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204695.27352: variable 'network_connections' from source: task vars 49116 1727204695.27355: variable 'interface' from source: play vars 49116 1727204695.27440: variable 'interface' from source: play vars 49116 1727204695.27451: variable 'vlan_interface' from source: play vars 49116 1727204695.27527: variable 'vlan_interface' from source: play vars 49116 1727204695.27533: variable 'interface' from source: play vars 49116 1727204695.27605: variable 'interface' from source: play vars 49116 1727204695.27648: variable '__network_packages_default_team' from source: role '' defaults 49116 1727204695.27739: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204695.28083: variable 'network_connections' from source: task vars 49116 1727204695.28087: variable 'interface' from source: play vars 49116 1727204695.28151: variable 'interface' from source: play vars 49116 1727204695.28169: variable 'vlan_interface' from source: play vars 49116 1727204695.28228: variable 'vlan_interface' from source: play vars 49116 1727204695.28236: variable 'interface' from source: play vars 49116 1727204695.28394: variable 'interface' from source: play vars 49116 1727204695.28481: variable '__network_service_name_default_initscripts' from source: role '' defaults 49116 1727204695.28553: variable '__network_service_name_default_initscripts' from source: role '' defaults 49116 1727204695.28568: variable '__network_packages_default_initscripts' from source: role '' defaults 49116 1727204695.28670: variable '__network_packages_default_initscripts' from source: role '' defaults 49116 1727204695.28963: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 49116 1727204695.29626: variable 'network_connections' from source: task vars 49116 1727204695.29648: variable 'interface' from source: play vars 49116 1727204695.29735: variable 'interface' from source: play vars 49116 1727204695.29760: variable 'vlan_interface' from source: play vars 49116 1727204695.29853: variable 'vlan_interface' from source: play vars 49116 1727204695.29874: variable 'interface' from source: play vars 49116 1727204695.29960: variable 'interface' from source: play vars 49116 1727204695.30023: variable 'ansible_distribution' from source: facts 49116 1727204695.30027: variable '__network_rh_distros' from source: role '' defaults 49116 1727204695.30029: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.30032: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 49116 1727204695.30281: variable 'ansible_distribution' from source: facts 49116 1727204695.30293: variable '__network_rh_distros' from source: role '' defaults 49116 1727204695.30304: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.30335: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 49116 1727204695.30541: variable 'ansible_distribution' from source: facts 49116 1727204695.30551: variable '__network_rh_distros' from source: 
role '' defaults 49116 1727204695.30562: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.30649: variable 'network_provider' from source: set_fact 49116 1727204695.30726: variable 'ansible_facts' from source: unknown 49116 1727204695.31963: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 49116 1727204695.31969: when evaluation is False, skipping this task 49116 1727204695.31972: _execute() done 49116 1727204695.31974: dumping result to json 49116 1727204695.31976: done dumping result, returning 49116 1727204695.31982: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-02f7-957b-00000000001e] 49116 1727204695.31987: sending task result for task 127b8e07-fff9-02f7-957b-00000000001e skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 49116 1727204695.32154: no more pending results, returning what we have 49116 1727204695.32158: results queue empty 49116 1727204695.32160: checking for any_errors_fatal 49116 1727204695.32172: done checking for any_errors_fatal 49116 1727204695.32173: checking for max_fail_percentage 49116 1727204695.32175: done checking for max_fail_percentage 49116 1727204695.32176: checking to see if all hosts have failed and the running result is not ok 49116 1727204695.32177: done checking to see if all hosts have failed 49116 1727204695.32178: getting the remaining hosts for this loop 49116 1727204695.32180: done getting the remaining hosts for this loop 49116 1727204695.32185: getting the next task for host managed-node3 49116 1727204695.32193: done getting next task for host managed-node3 49116 1727204695.32197: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 49116 1727204695.32323: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204695.32345: getting variables 49116 1727204695.32348: in VariableManager get_vars() 49116 1727204695.32205: done sending task result for task 127b8e07-fff9-02f7-957b-00000000001e 49116 1727204695.32405: WORKER PROCESS EXITING 49116 1727204695.32397: Calling all_inventory to load vars for managed-node3 49116 1727204695.32575: Calling groups_inventory to load vars for managed-node3 49116 1727204695.32579: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204695.32591: Calling all_plugins_play to load vars for managed-node3 49116 1727204695.32594: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204695.32597: Calling groups_plugins_play to load vars for managed-node3 49116 1727204695.34458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204695.37032: done with get_vars() 49116 1727204695.37069: done getting variables 49116 1727204695.37147: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.192) 0:00:18.396 ***** 49116 1727204695.37188: entering _queue_task() for managed-node3/package 49116 1727204695.37794: worker is 1 (out of 1 available) 49116 1727204695.37808: exiting _queue_task() for managed-node3/package 49116 1727204695.37821: done queuing things up, now waiting for results queue to drain 49116 1727204695.37823: waiting for pending results... 
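[editor's note] The "Install packages" skip recorded above is gated on the Jinja2 "subset" test: the package step only runs when at least one entry in network_packages is missing from the gathered package facts. A minimal Python sketch of that check follows; the sample values are made up for illustration (the log does not print the actual lists), and "NetworkManager" as the nm-provider default package is an assumption.

# Rough equivalent of: not network_packages is subset(ansible_facts.packages.keys())
# Sample data is illustrative only; the real values come from role defaults and package facts.
network_packages = ["NetworkManager"]                      # assumed role default for the nm provider
packages_facts = {"NetworkManager": [{"version": "x"}],    # hypothetical ansible_facts.packages content
                  "openssh-server": [{"version": "y"}]}

needs_install = not set(network_packages).issubset(packages_facts.keys())
print(needs_install)   # False -> task skipped, matching "Conditional result was False"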
49116 1727204695.37982: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 49116 1727204695.38126: in run() - task 127b8e07-fff9-02f7-957b-00000000001f 49116 1727204695.38144: variable 'ansible_search_path' from source: unknown 49116 1727204695.38148: variable 'ansible_search_path' from source: unknown 49116 1727204695.38274: calling self._execute() 49116 1727204695.38315: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204695.38322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204695.38333: variable 'omit' from source: magic vars 49116 1727204695.38972: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.38977: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204695.38980: variable 'network_state' from source: role '' defaults 49116 1727204695.38984: Evaluated conditional (network_state != {}): False 49116 1727204695.38986: when evaluation is False, skipping this task 49116 1727204695.38988: _execute() done 49116 1727204695.38991: dumping result to json 49116 1727204695.38993: done dumping result, returning 49116 1727204695.38996: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-02f7-957b-00000000001f] 49116 1727204695.38999: sending task result for task 127b8e07-fff9-02f7-957b-00000000001f skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204695.39156: no more pending results, returning what we have 49116 1727204695.39161: results queue empty 49116 1727204695.39162: checking for any_errors_fatal 49116 1727204695.39171: done checking for any_errors_fatal 49116 1727204695.39172: checking for max_fail_percentage 49116 1727204695.39174: done checking for max_fail_percentage 49116 1727204695.39175: checking to see if all hosts have failed and the running result is not ok 49116 1727204695.39176: done checking to see if all hosts have failed 49116 1727204695.39176: getting the remaining hosts for this loop 49116 1727204695.39178: done getting the remaining hosts for this loop 49116 1727204695.39183: getting the next task for host managed-node3 49116 1727204695.39190: done getting next task for host managed-node3 49116 1727204695.39194: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 49116 1727204695.39198: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204695.39217: getting variables 49116 1727204695.39219: in VariableManager get_vars() 49116 1727204695.39388: Calling all_inventory to load vars for managed-node3 49116 1727204695.39391: Calling groups_inventory to load vars for managed-node3 49116 1727204695.39394: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204695.39408: Calling all_plugins_play to load vars for managed-node3 49116 1727204695.39411: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204695.39414: Calling groups_plugins_play to load vars for managed-node3 49116 1727204695.40012: done sending task result for task 127b8e07-fff9-02f7-957b-00000000001f 49116 1727204695.40674: WORKER PROCESS EXITING 49116 1727204695.41371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204695.43575: done with get_vars() 49116 1727204695.43618: done getting variables 49116 1727204695.43693: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.065) 0:00:18.462 ***** 49116 1727204695.43733: entering _queue_task() for managed-node3/package 49116 1727204695.44111: worker is 1 (out of 1 available) 49116 1727204695.44124: exiting _queue_task() for managed-node3/package 49116 1727204695.44138: done queuing things up, now waiting for results queue to drain 49116 1727204695.44140: waiting for pending results... 
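[editor's note] Both nmstate-related install tasks (this one and the python3-libnmstate task that follows) skip for the same reason: network_state keeps its role default of an empty dict in this play, so the condition "network_state != {}" evaluates False. A tiny sketch, assuming the empty role default and a play that never sets the variable:

# Sketch of the conditional behind both "... when using network_state variable" skips.
role_defaults = {"network_state": {}}      # role default: empty dict
play_vars = {}                             # this play does not define network_state

network_state = play_vars.get("network_state", role_defaults["network_state"])
print(network_state != {})                 # False -> "Conditional result was False" in both tasks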
49116 1727204695.44501: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 49116 1727204695.44708: in run() - task 127b8e07-fff9-02f7-957b-000000000020 49116 1727204695.44714: variable 'ansible_search_path' from source: unknown 49116 1727204695.44716: variable 'ansible_search_path' from source: unknown 49116 1727204695.44768: calling self._execute() 49116 1727204695.44906: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204695.44909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204695.45073: variable 'omit' from source: magic vars 49116 1727204695.45418: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.45440: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204695.45588: variable 'network_state' from source: role '' defaults 49116 1727204695.45627: Evaluated conditional (network_state != {}): False 49116 1727204695.45637: when evaluation is False, skipping this task 49116 1727204695.45645: _execute() done 49116 1727204695.45653: dumping result to json 49116 1727204695.45661: done dumping result, returning 49116 1727204695.45676: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-02f7-957b-000000000020] 49116 1727204695.45687: sending task result for task 127b8e07-fff9-02f7-957b-000000000020 49116 1727204695.46005: done sending task result for task 127b8e07-fff9-02f7-957b-000000000020 49116 1727204695.46010: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204695.46155: no more pending results, returning what we have 49116 1727204695.46160: results queue empty 49116 1727204695.46161: checking for any_errors_fatal 49116 1727204695.46175: done checking for any_errors_fatal 49116 1727204695.46176: checking for max_fail_percentage 49116 1727204695.46178: done checking for max_fail_percentage 49116 1727204695.46179: checking to see if all hosts have failed and the running result is not ok 49116 1727204695.46181: done checking to see if all hosts have failed 49116 1727204695.46181: getting the remaining hosts for this loop 49116 1727204695.46183: done getting the remaining hosts for this loop 49116 1727204695.46189: getting the next task for host managed-node3 49116 1727204695.46198: done getting next task for host managed-node3 49116 1727204695.46203: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 49116 1727204695.46207: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204695.46227: getting variables 49116 1727204695.46229: in VariableManager get_vars() 49116 1727204695.46426: Calling all_inventory to load vars for managed-node3 49116 1727204695.46430: Calling groups_inventory to load vars for managed-node3 49116 1727204695.46432: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204695.46443: Calling all_plugins_play to load vars for managed-node3 49116 1727204695.46446: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204695.46449: Calling groups_plugins_play to load vars for managed-node3 49116 1727204695.48471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204695.50831: done with get_vars() 49116 1727204695.50879: done getting variables 49116 1727204695.51001: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.073) 0:00:18.535 ***** 49116 1727204695.51043: entering _queue_task() for managed-node3/service 49116 1727204695.51045: Creating lock for service 49116 1727204695.51442: worker is 1 (out of 1 available) 49116 1727204695.51458: exiting _queue_task() for managed-node3/service 49116 1727204695.51678: done queuing things up, now waiting for results queue to drain 49116 1727204695.51680: waiting for pending results... 
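[editor's note] A small detail visible in the plugin-loading lines: the first lookup of the 'service' action plugin above reports found_in_cache=False, while later lookups report found_in_cache=True, because loaded plugins are memoized per path. A generic memoizing-loader sketch of that pattern (names are illustrative, not Ansible's loader API):

# Generic plugin-cache sketch; not Ansible's actual loader.
_cache = {}

def load_plugin(path):
    found_in_cache = path in _cache
    if not found_in_cache:
        _cache[path] = object()   # stand-in for importing the module at `path`
    print("Loading %s (found_in_cache=%s)" % (path, found_in_cache))
    return _cache[path]

load_plugin("action/service.py")   # found_in_cache=False
load_plugin("action/service.py")   # found_in_cache=True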
49116 1727204695.51841: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 49116 1727204695.52186: in run() - task 127b8e07-fff9-02f7-957b-000000000021 49116 1727204695.52190: variable 'ansible_search_path' from source: unknown 49116 1727204695.52193: variable 'ansible_search_path' from source: unknown 49116 1727204695.52196: calling self._execute() 49116 1727204695.52199: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204695.52201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204695.52204: variable 'omit' from source: magic vars 49116 1727204695.52699: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.52710: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204695.52861: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204695.53111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204695.55516: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204695.55587: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204695.55619: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204695.55651: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204695.55675: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204695.55746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.55770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.55792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.55824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.55836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.55877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.55897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.55916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 49116 1727204695.55948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.55960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.55995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.56013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.56033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.56064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.56076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.56207: variable 'network_connections' from source: task vars 49116 1727204695.56219: variable 'interface' from source: play vars 49116 1727204695.56288: variable 'interface' from source: play vars 49116 1727204695.56298: variable 'vlan_interface' from source: play vars 49116 1727204695.56349: variable 'vlan_interface' from source: play vars 49116 1727204695.56355: variable 'interface' from source: play vars 49116 1727204695.56404: variable 'interface' from source: play vars 49116 1727204695.56463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204695.56607: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204695.56636: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204695.56679: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204695.56715: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204695.56745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204695.56902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204695.56905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.56908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204695.56910: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204695.57373: variable 'network_connections' from source: task vars 49116 1727204695.57376: variable 'interface' from source: play vars 49116 1727204695.57379: variable 'interface' from source: play vars 49116 1727204695.57381: variable 'vlan_interface' from source: play vars 49116 1727204695.57383: variable 'vlan_interface' from source: play vars 49116 1727204695.57386: variable 'interface' from source: play vars 49116 1727204695.57388: variable 'interface' from source: play vars 49116 1727204695.57422: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49116 1727204695.57433: when evaluation is False, skipping this task 49116 1727204695.57435: _execute() done 49116 1727204695.57438: dumping result to json 49116 1727204695.57440: done dumping result, returning 49116 1727204695.57443: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-02f7-957b-000000000021] 49116 1727204695.57449: sending task result for task 127b8e07-fff9-02f7-957b-000000000021 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49116 1727204695.57661: no more pending results, returning what we have 49116 1727204695.57666: results queue empty 49116 1727204695.57668: checking for any_errors_fatal 49116 1727204695.57677: done checking for any_errors_fatal 49116 1727204695.57678: checking for max_fail_percentage 49116 1727204695.57679: done checking for max_fail_percentage 49116 1727204695.57680: checking to see if all hosts have failed and the running result is not ok 49116 1727204695.57681: done checking to see if all hosts have failed 49116 1727204695.57682: getting the remaining hosts for this loop 49116 1727204695.57683: done getting the remaining hosts for this loop 49116 1727204695.57687: getting the next task for host managed-node3 49116 1727204695.57694: done getting next task for host managed-node3 49116 1727204695.57698: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 49116 1727204695.57701: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204695.57723: done sending task result for task 127b8e07-fff9-02f7-957b-000000000021 49116 1727204695.57726: WORKER PROCESS EXITING 49116 1727204695.57771: getting variables 49116 1727204695.57773: in VariableManager get_vars() 49116 1727204695.57814: Calling all_inventory to load vars for managed-node3 49116 1727204695.57817: Calling groups_inventory to load vars for managed-node3 49116 1727204695.57819: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204695.57859: Calling all_plugins_play to load vars for managed-node3 49116 1727204695.57863: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204695.57869: Calling groups_plugins_play to load vars for managed-node3 49116 1727204695.59064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204695.60452: done with get_vars() 49116 1727204695.60484: done getting variables 49116 1727204695.60551: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:04:55 -0400 (0:00:00.095) 0:00:18.630 ***** 49116 1727204695.60591: entering _queue_task() for managed-node3/service 49116 1727204695.60980: worker is 1 (out of 1 available) 49116 1727204695.60994: exiting _queue_task() for managed-node3/service 49116 1727204695.61009: done queuing things up, now waiting for results queue to drain 49116 1727204695.61011: waiting for pending results... 
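[editor's note] The "Restart NetworkManager due to wireless or team interfaces" skip above comes down to two derived flags that the role computes from the connection types in network_connections. A hedged sketch of that evaluation, using hypothetical connection entries shaped like this test play (an ethernet interface plus a VLAN on top of it); the real role derives the flags with Jinja2 expressions, not this Python:

# Hedged sketch only: entry names and shapes are stand-ins, not the play's data.
network_connections = [
    {"name": "eth-test", "type": "ethernet"},                        # hypothetical
    {"name": "eth-test.100", "type": "vlan", "parent": "eth-test"},  # hypothetical
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)
print(wireless_defined or team_defined)    # False -> NetworkManager restart skipped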
49116 1727204695.61338: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 49116 1727204695.61472: in run() - task 127b8e07-fff9-02f7-957b-000000000022 49116 1727204695.61552: variable 'ansible_search_path' from source: unknown 49116 1727204695.61558: variable 'ansible_search_path' from source: unknown 49116 1727204695.61561: calling self._execute() 49116 1727204695.61633: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204695.61642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204695.61659: variable 'omit' from source: magic vars 49116 1727204695.62099: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.62106: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204695.62252: variable 'network_provider' from source: set_fact 49116 1727204695.62256: variable 'network_state' from source: role '' defaults 49116 1727204695.62264: Evaluated conditional (network_provider == "nm" or network_state != {}): True 49116 1727204695.62272: variable 'omit' from source: magic vars 49116 1727204695.62316: variable 'omit' from source: magic vars 49116 1727204695.62343: variable 'network_service_name' from source: role '' defaults 49116 1727204695.62406: variable 'network_service_name' from source: role '' defaults 49116 1727204695.62497: variable '__network_provider_setup' from source: role '' defaults 49116 1727204695.62501: variable '__network_service_name_default_nm' from source: role '' defaults 49116 1727204695.62555: variable '__network_service_name_default_nm' from source: role '' defaults 49116 1727204695.62565: variable '__network_packages_default_nm' from source: role '' defaults 49116 1727204695.62614: variable '__network_packages_default_nm' from source: role '' defaults 49116 1727204695.62792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204695.64705: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204695.64776: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204695.64813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204695.64852: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204695.64875: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204695.65199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.65203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.65206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.65209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 49116 1727204695.65212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.65215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.65218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.65221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.65223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.65225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.65604: variable '__network_packages_default_gobject_packages' from source: role '' defaults 49116 1727204695.65614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.65643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.65670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.65711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.65726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.65825: variable 'ansible_python' from source: facts 49116 1727204695.65853: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 49116 1727204695.65946: variable '__network_wpa_supplicant_required' from source: role '' defaults 49116 1727204695.66049: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49116 1727204695.66178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.66209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.66230: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.66276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.66295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.66334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204695.66358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204695.66380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204695.66407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204695.66419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204695.66529: variable 'network_connections' from source: task vars 49116 1727204695.66539: variable 'interface' from source: play vars 49116 1727204695.66600: variable 'interface' from source: play vars 49116 1727204695.66613: variable 'vlan_interface' from source: play vars 49116 1727204695.66670: variable 'vlan_interface' from source: play vars 49116 1727204695.66681: variable 'interface' from source: play vars 49116 1727204695.66734: variable 'interface' from source: play vars 49116 1727204695.66821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204695.66980: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204695.67021: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204695.67057: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204695.67089: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204695.67141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204695.67166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204695.67190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 
1727204695.67216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204695.67261: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204695.67466: variable 'network_connections' from source: task vars 49116 1727204695.67473: variable 'interface' from source: play vars 49116 1727204695.67528: variable 'interface' from source: play vars 49116 1727204695.67542: variable 'vlan_interface' from source: play vars 49116 1727204695.67601: variable 'vlan_interface' from source: play vars 49116 1727204695.67610: variable 'interface' from source: play vars 49116 1727204695.67670: variable 'interface' from source: play vars 49116 1727204695.67711: variable '__network_packages_default_wireless' from source: role '' defaults 49116 1727204695.67775: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204695.67983: variable 'network_connections' from source: task vars 49116 1727204695.67986: variable 'interface' from source: play vars 49116 1727204695.68043: variable 'interface' from source: play vars 49116 1727204695.68051: variable 'vlan_interface' from source: play vars 49116 1727204695.68106: variable 'vlan_interface' from source: play vars 49116 1727204695.68115: variable 'interface' from source: play vars 49116 1727204695.68200: variable 'interface' from source: play vars 49116 1727204695.68209: variable '__network_packages_default_team' from source: role '' defaults 49116 1727204695.68364: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204695.68671: variable 'network_connections' from source: task vars 49116 1727204695.68675: variable 'interface' from source: play vars 49116 1727204695.68677: variable 'interface' from source: play vars 49116 1727204695.68703: variable 'vlan_interface' from source: play vars 49116 1727204695.68761: variable 'vlan_interface' from source: play vars 49116 1727204695.68768: variable 'interface' from source: play vars 49116 1727204695.68842: variable 'interface' from source: play vars 49116 1727204695.68971: variable '__network_service_name_default_initscripts' from source: role '' defaults 49116 1727204695.68975: variable '__network_service_name_default_initscripts' from source: role '' defaults 49116 1727204695.68982: variable '__network_packages_default_initscripts' from source: role '' defaults 49116 1727204695.69046: variable '__network_packages_default_initscripts' from source: role '' defaults 49116 1727204695.69285: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 49116 1727204695.69731: variable 'network_connections' from source: task vars 49116 1727204695.69735: variable 'interface' from source: play vars 49116 1727204695.69791: variable 'interface' from source: play vars 49116 1727204695.69799: variable 'vlan_interface' from source: play vars 49116 1727204695.69851: variable 'vlan_interface' from source: play vars 49116 1727204695.69857: variable 'interface' from source: play vars 49116 1727204695.69904: variable 'interface' from source: play vars 49116 1727204695.69914: variable 'ansible_distribution' from source: facts 49116 1727204695.69917: variable '__network_rh_distros' from source: role '' defaults 49116 1727204695.69923: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.69947: variable 
'__network_packages_default_initscripts_network_scripts' from source: role '' defaults 49116 1727204695.70077: variable 'ansible_distribution' from source: facts 49116 1727204695.70081: variable '__network_rh_distros' from source: role '' defaults 49116 1727204695.70086: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.70094: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 49116 1727204695.70220: variable 'ansible_distribution' from source: facts 49116 1727204695.70223: variable '__network_rh_distros' from source: role '' defaults 49116 1727204695.70228: variable 'ansible_distribution_major_version' from source: facts 49116 1727204695.70259: variable 'network_provider' from source: set_fact 49116 1727204695.70280: variable 'omit' from source: magic vars 49116 1727204695.70305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204695.70332: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204695.70351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204695.70367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204695.70377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204695.70402: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204695.70405: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204695.70408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204695.70487: Set connection var ansible_connection to ssh 49116 1727204695.70497: Set connection var ansible_timeout to 10 49116 1727204695.70505: Set connection var ansible_shell_executable to /bin/sh 49116 1727204695.70510: Set connection var ansible_pipelining to False 49116 1727204695.70513: Set connection var ansible_shell_type to sh 49116 1727204695.70518: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204695.70543: variable 'ansible_shell_executable' from source: unknown 49116 1727204695.70546: variable 'ansible_connection' from source: unknown 49116 1727204695.70548: variable 'ansible_module_compression' from source: unknown 49116 1727204695.70551: variable 'ansible_shell_type' from source: unknown 49116 1727204695.70553: variable 'ansible_shell_executable' from source: unknown 49116 1727204695.70555: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204695.70559: variable 'ansible_pipelining' from source: unknown 49116 1727204695.70561: variable 'ansible_timeout' from source: unknown 49116 1727204695.70569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204695.70650: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204695.70660: variable 'omit' from source: magic vars 49116 1727204695.70670: starting attempt loop 49116 1727204695.70673: running the handler 49116 1727204695.70741: variable 'ansible_facts' from source: unknown 49116 
1727204695.71389: _low_level_execute_command(): starting 49116 1727204695.71394: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204695.72014: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204695.72021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204695.72024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204695.72073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204695.72077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204695.72084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204695.72161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204695.74020: stdout chunk (state=3): >>>/root <<< 49116 1727204695.74118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204695.74187: stderr chunk (state=3): >>><<< 49116 1727204695.74190: stdout chunk (state=3): >>><<< 49116 1727204695.74211: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204695.74222: _low_level_execute_command(): starting 49116 1727204695.74229: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696 `" && echo ansible-tmp-1727204695.7421162-50305-25444657082696="` echo 
/root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696 `" ) && sleep 0' 49116 1727204695.74741: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204695.74745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204695.74748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204695.74751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204695.74808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204695.74812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204695.74816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204695.74894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204695.77075: stdout chunk (state=3): >>>ansible-tmp-1727204695.7421162-50305-25444657082696=/root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696 <<< 49116 1727204695.77185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204695.77250: stderr chunk (state=3): >>><<< 49116 1727204695.77253: stdout chunk (state=3): >>><<< 49116 1727204695.77269: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204695.7421162-50305-25444657082696=/root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204695.77302: variable 'ansible_module_compression' from source: unknown 49116 1727204695.77363: ANSIBALLZ: Using generic lock 
for ansible.legacy.systemd 49116 1727204695.77370: ANSIBALLZ: Acquiring lock 49116 1727204695.77373: ANSIBALLZ: Lock acquired: 139720119767104 49116 1727204695.77375: ANSIBALLZ: Creating module 49116 1727204696.11537: ANSIBALLZ: Writing module into payload 49116 1727204696.11973: ANSIBALLZ: Writing module 49116 1727204696.11978: ANSIBALLZ: Renaming module 49116 1727204696.11981: ANSIBALLZ: Done creating module 49116 1727204696.11985: variable 'ansible_facts' from source: unknown 49116 1727204696.12074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/AnsiballZ_systemd.py 49116 1727204696.12360: Sending initial data 49116 1727204696.12372: Sent initial data (155 bytes) 49116 1727204696.12958: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204696.12981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204696.13091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204696.13119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204696.13141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204696.13163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204696.13279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204696.15103: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204696.15174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204696.15258: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp_i2hmfbm /root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/AnsiballZ_systemd.py <<< 49116 1727204696.15262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/AnsiballZ_systemd.py" <<< 49116 1727204696.15338: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp_i2hmfbm" to remote "/root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/AnsiballZ_systemd.py" <<< 49116 1727204696.17309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204696.17356: stderr chunk (state=3): >>><<< 49116 1727204696.17372: stdout chunk (state=3): >>><<< 49116 1727204696.17475: done transferring module to remote 49116 1727204696.17478: _low_level_execute_command(): starting 49116 1727204696.17481: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/ /root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/AnsiballZ_systemd.py && sleep 0' 49116 1727204696.18390: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204696.18409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204696.18627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204696.18656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204696.18708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204696.18987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204696.21061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204696.21084: stdout chunk (state=3): >>><<< 49116 1727204696.21099: stderr chunk (state=3): >>><<< 49116 1727204696.21121: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204696.21132: _low_level_execute_command(): starting 49116 1727204696.21147: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/AnsiballZ_systemd.py && sleep 0' 49116 1727204696.22462: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204696.22492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204696.22606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204696.22654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204696.22673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204696.22792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204696.22936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204696.56554: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "75711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", 
"UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ExecMainStartTimestampMonotonic": "992436115", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "75711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 15:04:14 EDT] ; stop_time=[n/a] ; pid=75711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 15:04:14 EDT] ; stop_time=[n/a] ; pid=75711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "7558", "MemoryCurrent": "3723264", "MemoryPeak": "4870144", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3504758784", "CPUUsageNSec": "135470000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "Limi<<< 49116 1727204696.56585: stdout chunk (state=3): >>>tSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", 
"LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target cloud-init.service network.target multi-user.target 
NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus.socket cloud-init-local.service sysinit.target basic.target system.slice network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", <<< 49116 1727204696.56601: stdout chunk (state=3): >>>"Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:04:15 EDT", "StateChangeTimestampMonotonic": "992533498", "InactiveExitTimestamp": "Tue 2024-09-24 15:04:14 EDT", "InactiveExitTimestampMonotonic": "992436398", "ActiveEnterTimestamp": "Tue 2024-09-24 15:04:15 EDT", "ActiveEnterTimestampMonotonic": "992533498", "ActiveExitTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ActiveExitTimestampMonotonic": "992357833", "InactiveEnterTimestamp": "Tue 2024-09-24 15:04:14 EDT", "InactiveEnterTimestampMonotonic": "992431355", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ConditionTimestampMonotonic": "992432565", "AssertTimestamp": "Tue 2024-09-24 15:04:14 EDT", "AssertTimestampMonotonic": "992432569", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "149c166e8026437d99b665831d791274", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 49116 1727204696.58947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204696.58952: stdout chunk (state=3): >>><<< 49116 1727204696.58954: stderr chunk (state=3): >>><<< 49116 1727204696.58978: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "75711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ExecMainStartTimestampMonotonic": "992436115", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "75711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 15:04:14 EDT] ; stop_time=[n/a] ; pid=75711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 15:04:14 EDT] ; stop_time=[n/a] ; pid=75711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "7558", "MemoryCurrent": "3723264", "MemoryPeak": "4870144", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3504758784", "CPUUsageNSec": "135470000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": 
"infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target cloud-init.service network.target multi-user.target NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus.socket cloud-init-local.service sysinit.target basic.target system.slice network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:04:15 EDT", "StateChangeTimestampMonotonic": "992533498", "InactiveExitTimestamp": "Tue 2024-09-24 15:04:14 EDT", "InactiveExitTimestampMonotonic": "992436398", "ActiveEnterTimestamp": "Tue 2024-09-24 15:04:15 EDT", "ActiveEnterTimestampMonotonic": "992533498", "ActiveExitTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ActiveExitTimestampMonotonic": "992357833", "InactiveEnterTimestamp": "Tue 2024-09-24 15:04:14 EDT", "InactiveEnterTimestampMonotonic": "992431355", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ConditionTimestampMonotonic": "992432565", "AssertTimestamp": "Tue 2024-09-24 15:04:14 EDT", "AssertTimestampMonotonic": "992432569", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "149c166e8026437d99b665831d791274", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 49116 1727204696.59322: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204696.59327: _low_level_execute_command(): starting 49116 1727204696.59330: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204695.7421162-50305-25444657082696/ > /dev/null 2>&1 && sleep 0' 49116 1727204696.60021: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204696.60083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204696.60179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204696.60327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204696.62457: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 49116 1727204696.62461: stdout chunk (state=3): >>><<< 49116 1727204696.62464: stderr chunk (state=3): >>><<< 49116 1727204696.62485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204696.62528: handler run complete 49116 1727204696.62608: attempt loop complete, returning result 49116 1727204696.62649: _execute() done 49116 1727204696.62652: dumping result to json 49116 1727204696.62655: done dumping result, returning 49116 1727204696.62673: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-02f7-957b-000000000022] 49116 1727204696.62684: sending task result for task 127b8e07-fff9-02f7-957b-000000000022 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204696.63337: no more pending results, returning what we have 49116 1727204696.63341: results queue empty 49116 1727204696.63343: checking for any_errors_fatal 49116 1727204696.63350: done checking for any_errors_fatal 49116 1727204696.63351: checking for max_fail_percentage 49116 1727204696.63352: done checking for max_fail_percentage 49116 1727204696.63353: checking to see if all hosts have failed and the running result is not ok 49116 1727204696.63354: done checking to see if all hosts have failed 49116 1727204696.63355: getting the remaining hosts for this loop 49116 1727204696.63357: done getting the remaining hosts for this loop 49116 1727204696.63362: getting the next task for host managed-node3 49116 1727204696.63372: done getting next task for host managed-node3 49116 1727204696.63377: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 49116 1727204696.63380: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 49116 1727204696.63394: getting variables 49116 1727204696.63396: in VariableManager get_vars() 49116 1727204696.63439: Calling all_inventory to load vars for managed-node3 49116 1727204696.63443: Calling groups_inventory to load vars for managed-node3 49116 1727204696.63445: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204696.63457: Calling all_plugins_play to load vars for managed-node3 49116 1727204696.63460: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204696.63463: Calling groups_plugins_play to load vars for managed-node3 49116 1727204696.63545: done sending task result for task 127b8e07-fff9-02f7-957b-000000000022 49116 1727204696.63549: WORKER PROCESS EXITING 49116 1727204696.65159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204696.67331: done with get_vars() 49116 1727204696.67371: done getting variables 49116 1727204696.67422: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:56 -0400 (0:00:01.068) 0:00:19.699 ***** 49116 1727204696.67455: entering _queue_task() for managed-node3/service 49116 1727204696.67753: worker is 1 (out of 1 available) 49116 1727204696.67773: exiting _queue_task() for managed-node3/service 49116 1727204696.67786: done queuing things up, now waiting for results queue to drain 49116 1727204696.67788: waiting for pending results... 
49116 1727204696.67986: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 49116 1727204696.68174: in run() - task 127b8e07-fff9-02f7-957b-000000000023 49116 1727204696.68178: variable 'ansible_search_path' from source: unknown 49116 1727204696.68181: variable 'ansible_search_path' from source: unknown 49116 1727204696.68201: calling self._execute() 49116 1727204696.68320: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204696.68472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204696.68475: variable 'omit' from source: magic vars 49116 1727204696.68810: variable 'ansible_distribution_major_version' from source: facts 49116 1727204696.68837: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204696.68990: variable 'network_provider' from source: set_fact 49116 1727204696.69003: Evaluated conditional (network_provider == "nm"): True 49116 1727204696.69125: variable '__network_wpa_supplicant_required' from source: role '' defaults 49116 1727204696.69247: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49116 1727204696.69421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204696.71467: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204696.71527: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204696.71561: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204696.71591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204696.71611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204696.71689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204696.71711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204696.71730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204696.71765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204696.71779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204696.71818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204696.71835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 49116 1727204696.71859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204696.71899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204696.71910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204696.71944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204696.71962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204696.71983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204696.72013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204696.72025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204696.72135: variable 'network_connections' from source: task vars 49116 1727204696.72150: variable 'interface' from source: play vars 49116 1727204696.72217: variable 'interface' from source: play vars 49116 1727204696.72229: variable 'vlan_interface' from source: play vars 49116 1727204696.72281: variable 'vlan_interface' from source: play vars 49116 1727204696.72287: variable 'interface' from source: play vars 49116 1727204696.72336: variable 'interface' from source: play vars 49116 1727204696.72403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204696.72570: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204696.72602: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204696.72628: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204696.72658: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204696.72699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204696.72716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204696.72737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204696.72761: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204696.72808: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204696.73172: variable 'network_connections' from source: task vars 49116 1727204696.73176: variable 'interface' from source: play vars 49116 1727204696.73179: variable 'interface' from source: play vars 49116 1727204696.73181: variable 'vlan_interface' from source: play vars 49116 1727204696.73184: variable 'vlan_interface' from source: play vars 49116 1727204696.73186: variable 'interface' from source: play vars 49116 1727204696.73236: variable 'interface' from source: play vars 49116 1727204696.73291: Evaluated conditional (__network_wpa_supplicant_required): False 49116 1727204696.73295: when evaluation is False, skipping this task 49116 1727204696.73298: _execute() done 49116 1727204696.73300: dumping result to json 49116 1727204696.73302: done dumping result, returning 49116 1727204696.73313: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-02f7-957b-000000000023] 49116 1727204696.73319: sending task result for task 127b8e07-fff9-02f7-957b-000000000023 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 49116 1727204696.73579: no more pending results, returning what we have 49116 1727204696.73583: results queue empty 49116 1727204696.73584: checking for any_errors_fatal 49116 1727204696.73604: done checking for any_errors_fatal 49116 1727204696.73605: checking for max_fail_percentage 49116 1727204696.73607: done checking for max_fail_percentage 49116 1727204696.73608: checking to see if all hosts have failed and the running result is not ok 49116 1727204696.73609: done checking to see if all hosts have failed 49116 1727204696.73609: getting the remaining hosts for this loop 49116 1727204696.73610: done getting the remaining hosts for this loop 49116 1727204696.73614: getting the next task for host managed-node3 49116 1727204696.73621: done getting next task for host managed-node3 49116 1727204696.73625: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 49116 1727204696.73628: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204696.73644: getting variables 49116 1727204696.73645: in VariableManager get_vars() 49116 1727204696.73693: Calling all_inventory to load vars for managed-node3 49116 1727204696.73697: Calling groups_inventory to load vars for managed-node3 49116 1727204696.73699: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204696.73709: Calling all_plugins_play to load vars for managed-node3 49116 1727204696.73711: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204696.73714: Calling groups_plugins_play to load vars for managed-node3 49116 1727204696.74245: done sending task result for task 127b8e07-fff9-02f7-957b-000000000023 49116 1727204696.74248: WORKER PROCESS EXITING 49116 1727204696.75460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204696.76696: done with get_vars() 49116 1727204696.76727: done getting variables 49116 1727204696.76784: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.093) 0:00:19.793 ***** 49116 1727204696.76811: entering _queue_task() for managed-node3/service 49116 1727204696.77097: worker is 1 (out of 1 available) 49116 1727204696.77112: exiting _queue_task() for managed-node3/service 49116 1727204696.77126: done queuing things up, now waiting for results queue to drain 49116 1727204696.77127: waiting for pending results... 
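The skip result above for "Enable and start wpa_supplicant" (and the two skips that follow for the initscripts tasks) all come from the task's when conditions being evaluated in order until one is False; the first failing expression is reported back as false_condition. A small sketch of that evaluation order, with invented helper names and illustrative variable values rather than anything taken from Ansible's internals:

# Sketch of in-order "when" evaluation and the false_condition reporting seen
# in the skip results. evaluate_when and the variable values are invented for
# illustration; this is not how Ansible's conditional handling is implemented.
def evaluate_when(conditions, variables):
    for expr, predicate in conditions:
        if not predicate(variables):
            return {"changed": False,
                    "false_condition": expr,
                    "skip_reason": "Conditional result was False"}
    return None  # nothing failed; the task would run


variables = {
    "ansible_distribution_major_version": "40",   # illustrative value
    "network_provider": "nm",
    "__network_wpa_supplicant_required": False,
}

result = evaluate_when(
    [
        ("ansible_distribution_major_version != '6'",
         lambda v: v["ansible_distribution_major_version"] != "6"),
        ('network_provider == "nm"',
         lambda v: v["network_provider"] == "nm"),
        ("__network_wpa_supplicant_required",
         lambda v: bool(v["__network_wpa_supplicant_required"])),
    ],
    variables,
)
print(result)  # reports __network_wpa_supplicant_required, as in the log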
49116 1727204696.77329: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 49116 1727204696.77427: in run() - task 127b8e07-fff9-02f7-957b-000000000024 49116 1727204696.77443: variable 'ansible_search_path' from source: unknown 49116 1727204696.77446: variable 'ansible_search_path' from source: unknown 49116 1727204696.77484: calling self._execute() 49116 1727204696.77569: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204696.77575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204696.77588: variable 'omit' from source: magic vars 49116 1727204696.77896: variable 'ansible_distribution_major_version' from source: facts 49116 1727204696.77912: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204696.78001: variable 'network_provider' from source: set_fact 49116 1727204696.78005: Evaluated conditional (network_provider == "initscripts"): False 49116 1727204696.78008: when evaluation is False, skipping this task 49116 1727204696.78011: _execute() done 49116 1727204696.78016: dumping result to json 49116 1727204696.78019: done dumping result, returning 49116 1727204696.78032: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-02f7-957b-000000000024] 49116 1727204696.78035: sending task result for task 127b8e07-fff9-02f7-957b-000000000024 49116 1727204696.78136: done sending task result for task 127b8e07-fff9-02f7-957b-000000000024 49116 1727204696.78140: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204696.78196: no more pending results, returning what we have 49116 1727204696.78200: results queue empty 49116 1727204696.78201: checking for any_errors_fatal 49116 1727204696.78213: done checking for any_errors_fatal 49116 1727204696.78214: checking for max_fail_percentage 49116 1727204696.78216: done checking for max_fail_percentage 49116 1727204696.78218: checking to see if all hosts have failed and the running result is not ok 49116 1727204696.78218: done checking to see if all hosts have failed 49116 1727204696.78219: getting the remaining hosts for this loop 49116 1727204696.78220: done getting the remaining hosts for this loop 49116 1727204696.78225: getting the next task for host managed-node3 49116 1727204696.78233: done getting next task for host managed-node3 49116 1727204696.78238: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 49116 1727204696.78242: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204696.78268: getting variables 49116 1727204696.78270: in VariableManager get_vars() 49116 1727204696.78311: Calling all_inventory to load vars for managed-node3 49116 1727204696.78314: Calling groups_inventory to load vars for managed-node3 49116 1727204696.78316: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204696.78326: Calling all_plugins_play to load vars for managed-node3 49116 1727204696.78328: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204696.78331: Calling groups_plugins_play to load vars for managed-node3 49116 1727204696.79485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204696.80701: done with get_vars() 49116 1727204696.80734: done getting variables 49116 1727204696.80790: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.040) 0:00:19.833 ***** 49116 1727204696.80823: entering _queue_task() for managed-node3/copy 49116 1727204696.81123: worker is 1 (out of 1 available) 49116 1727204696.81137: exiting _queue_task() for managed-node3/copy 49116 1727204696.81152: done queuing things up, now waiting for results queue to drain 49116 1727204696.81153: waiting for pending results... 49116 1727204696.81370: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 49116 1727204696.81476: in run() - task 127b8e07-fff9-02f7-957b-000000000025 49116 1727204696.81490: variable 'ansible_search_path' from source: unknown 49116 1727204696.81494: variable 'ansible_search_path' from source: unknown 49116 1727204696.81529: calling self._execute() 49116 1727204696.81619: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204696.81624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204696.81628: variable 'omit' from source: magic vars 49116 1727204696.81939: variable 'ansible_distribution_major_version' from source: facts 49116 1727204696.82062: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204696.82070: variable 'network_provider' from source: set_fact 49116 1727204696.82073: Evaluated conditional (network_provider == "initscripts"): False 49116 1727204696.82076: when evaluation is False, skipping this task 49116 1727204696.82079: _execute() done 49116 1727204696.82081: dumping result to json 49116 1727204696.82083: done dumping result, returning 49116 1727204696.82087: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-02f7-957b-000000000025] 49116 1727204696.82089: sending task result for task 127b8e07-fff9-02f7-957b-000000000025 49116 1727204696.82177: done sending task result for task 127b8e07-fff9-02f7-957b-000000000025 49116 1727204696.82179: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 49116 1727204696.82230: no more pending results, returning what we have 49116 1727204696.82234: results queue empty 49116 1727204696.82236: checking for any_errors_fatal 49116 1727204696.82244: done checking for any_errors_fatal 49116 1727204696.82245: checking for max_fail_percentage 49116 1727204696.82247: done checking for max_fail_percentage 49116 1727204696.82249: checking to see if all hosts have failed and the running result is not ok 49116 1727204696.82249: done checking to see if all hosts have failed 49116 1727204696.82250: getting the remaining hosts for this loop 49116 1727204696.82252: done getting the remaining hosts for this loop 49116 1727204696.82256: getting the next task for host managed-node3 49116 1727204696.82264: done getting next task for host managed-node3 49116 1727204696.82271: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 49116 1727204696.82274: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204696.82293: getting variables 49116 1727204696.82294: in VariableManager get_vars() 49116 1727204696.82337: Calling all_inventory to load vars for managed-node3 49116 1727204696.82339: Calling groups_inventory to load vars for managed-node3 49116 1727204696.82342: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204696.82352: Calling all_plugins_play to load vars for managed-node3 49116 1727204696.82354: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204696.82357: Calling groups_plugins_play to load vars for managed-node3 49116 1727204696.83407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204696.84637: done with get_vars() 49116 1727204696.84674: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:04:56 -0400 (0:00:00.039) 0:00:19.872 ***** 49116 1727204696.84745: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 49116 1727204696.84746: Creating lock for fedora.linux_system_roles.network_connections 49116 1727204696.85058: worker is 1 (out of 1 available) 49116 1727204696.85073: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 49116 1727204696.85088: done queuing things up, now waiting for results queue to drain 49116 1727204696.85090: waiting for pending results... 
49116 1727204696.85288: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 49116 1727204696.85388: in run() - task 127b8e07-fff9-02f7-957b-000000000026 49116 1727204696.85402: variable 'ansible_search_path' from source: unknown 49116 1727204696.85406: variable 'ansible_search_path' from source: unknown 49116 1727204696.85443: calling self._execute() 49116 1727204696.85522: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204696.85530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204696.85544: variable 'omit' from source: magic vars 49116 1727204696.85855: variable 'ansible_distribution_major_version' from source: facts 49116 1727204696.85867: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204696.85876: variable 'omit' from source: magic vars 49116 1727204696.85922: variable 'omit' from source: magic vars 49116 1727204696.86063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204696.88047: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204696.88099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204696.88126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204696.88159: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204696.88182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204696.88246: variable 'network_provider' from source: set_fact 49116 1727204696.88360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204696.88384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204696.88403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204696.88433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204696.88446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204696.88509: variable 'omit' from source: magic vars 49116 1727204696.88602: variable 'omit' from source: magic vars 49116 1727204696.88680: variable 'network_connections' from source: task vars 49116 1727204696.88692: variable 'interface' from source: play vars 49116 1727204696.88746: variable 'interface' from source: play vars 49116 1727204696.88755: variable 'vlan_interface' from source: play vars 49116 1727204696.88801: variable 'vlan_interface' from source: play vars 49116 1727204696.88812: variable 'interface' from source: play vars 49116 
1727204696.88857: variable 'interface' from source: play vars 49116 1727204696.89020: variable 'omit' from source: magic vars 49116 1727204696.89029: variable '__lsr_ansible_managed' from source: task vars 49116 1727204696.89080: variable '__lsr_ansible_managed' from source: task vars 49116 1727204696.89314: Loaded config def from plugin (lookup/template) 49116 1727204696.89319: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 49116 1727204696.89344: File lookup term: get_ansible_managed.j2 49116 1727204696.89347: variable 'ansible_search_path' from source: unknown 49116 1727204696.89351: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 49116 1727204696.89371: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 49116 1727204696.89389: variable 'ansible_search_path' from source: unknown 49116 1727204696.94475: variable 'ansible_managed' from source: unknown 49116 1727204696.94621: variable 'omit' from source: magic vars 49116 1727204696.94670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204696.94709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204696.94739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204696.94765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204696.94785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204696.94822: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204696.94831: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204696.94843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204696.94948: Set connection var ansible_connection to ssh 49116 1727204696.95070: Set connection var ansible_timeout to 10 49116 1727204696.95074: Set connection var ansible_shell_executable to /bin/sh 49116 1727204696.95076: Set connection var ansible_pipelining to False 49116 1727204696.95078: Set connection var ansible_shell_type to sh 49116 1727204696.95080: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204696.95083: variable 
'ansible_shell_executable' from source: unknown 49116 1727204696.95085: variable 'ansible_connection' from source: unknown 49116 1727204696.95087: variable 'ansible_module_compression' from source: unknown 49116 1727204696.95089: variable 'ansible_shell_type' from source: unknown 49116 1727204696.95091: variable 'ansible_shell_executable' from source: unknown 49116 1727204696.95093: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204696.95095: variable 'ansible_pipelining' from source: unknown 49116 1727204696.95097: variable 'ansible_timeout' from source: unknown 49116 1727204696.95099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204696.95244: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204696.95262: variable 'omit' from source: magic vars 49116 1727204696.95279: starting attempt loop 49116 1727204696.95287: running the handler 49116 1727204696.95307: _low_level_execute_command(): starting 49116 1727204696.95319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204696.96118: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204696.96148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204696.96167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204696.96190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204696.96209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204696.96223: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204696.96241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204696.96344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204696.96369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204696.96489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204696.98355: stdout chunk (state=3): >>>/root <<< 49116 1727204696.98525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204696.98542: stdout chunk (state=3): >>><<< 49116 1727204696.98556: stderr chunk (state=3): >>><<< 49116 1727204696.98695: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204696.98699: _low_level_execute_command(): starting 49116 1727204696.98703: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786 `" && echo ansible-tmp-1727204696.9859252-50478-142891165062786="` echo /root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786 `" ) && sleep 0' 49116 1727204696.99415: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204696.99425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204696.99472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 49116 1727204696.99577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204696.99623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204696.99737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204697.01907: stdout chunk (state=3): >>>ansible-tmp-1727204696.9859252-50478-142891165062786=/root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786 <<< 49116 1727204697.02025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204697.02091: stderr chunk (state=3): >>><<< 49116 1727204697.02094: stdout chunk (state=3): >>><<< 49116 1727204697.02114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204696.9859252-50478-142891165062786=/root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204697.02163: variable 'ansible_module_compression' from source: unknown 49116 1727204697.02206: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 49116 1727204697.02211: ANSIBALLZ: Acquiring lock 49116 1727204697.02214: ANSIBALLZ: Lock acquired: 139720113990224 49116 1727204697.02216: ANSIBALLZ: Creating module 49116 1727204697.22105: ANSIBALLZ: Writing module into payload 49116 1727204697.22345: ANSIBALLZ: Writing module 49116 1727204697.22373: ANSIBALLZ: Renaming module 49116 1727204697.22377: ANSIBALLZ: Done creating module 49116 1727204697.22403: variable 'ansible_facts' from source: unknown 49116 1727204697.22478: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/AnsiballZ_network_connections.py 49116 1727204697.22607: Sending initial data 49116 1727204697.22611: Sent initial data (168 bytes) 49116 1727204697.23131: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204697.23137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204697.23140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204697.23198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204697.23201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204697.23294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204697.25153: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204697.25224: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204697.25293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpyhjb0lyb /root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/AnsiballZ_network_connections.py <<< 49116 1727204697.25297: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/AnsiballZ_network_connections.py" <<< 49116 1727204697.25357: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpyhjb0lyb" to remote "/root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/AnsiballZ_network_connections.py" <<< 49116 1727204697.25361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/AnsiballZ_network_connections.py" <<< 49116 1727204697.26217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204697.26301: stderr chunk (state=3): >>><<< 49116 1727204697.26305: stdout chunk (state=3): >>><<< 49116 1727204697.26327: done transferring module to remote 49116 1727204697.26338: _low_level_execute_command(): starting 49116 1727204697.26343: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/ /root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/AnsiballZ_network_connections.py && sleep 0' 49116 1727204697.26868: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204697.26874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204697.26877: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204697.26880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204697.26929: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204697.26932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204697.26935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204697.27013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204697.29056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204697.29124: stderr chunk (state=3): >>><<< 49116 1727204697.29128: stdout chunk (state=3): >>><<< 49116 1727204697.29142: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204697.29145: _low_level_execute_command(): starting 49116 1727204697.29151: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/AnsiballZ_network_connections.py && sleep 0' 49116 1727204697.29674: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204697.29680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204697.29687: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204697.29729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204697.29735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204697.29742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204697.29821: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204697.68415: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 49116 1727204697.72188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204697.72247: stderr chunk (state=3): >>><<< 49116 1727204697.72253: stdout chunk (state=3): >>><<< 49116 1727204697.72270: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204697.72318: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'type': 'ethernet', 'state': 'up', 'mtu': 1492, 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}, {'name': 'lsr101.90', 'parent': 'lsr101', 'type': 'vlan', 'vlan_id': 90, 'mtu': 1280, 'state': 'up', 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204697.72327: _low_level_execute_command(): starting 49116 1727204697.72335: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204696.9859252-50478-142891165062786/ > /dev/null 2>&1 && sleep 0' 49116 1727204697.72858: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204697.72862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204697.72866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204697.72917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204697.72926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204697.73017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204697.75211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204697.75275: stderr chunk (state=3): >>><<< 49116 1727204697.75281: stdout chunk (state=3): >>><<< 49116 1727204697.75297: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204697.75303: handler run complete 49116 1727204697.75344: attempt loop complete, returning result 49116 1727204697.75348: _execute() done 49116 1727204697.75351: dumping result to json 49116 1727204697.75355: done dumping result, returning 49116 1727204697.75366: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-02f7-957b-000000000026] 49116 1727204697.75372: sending task result for task 127b8e07-fff9-02f7-957b-000000000026 49116 1727204697.75501: done sending task result for task 127b8e07-fff9-02f7-957b-000000000026 49116 1727204697.75504: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b [006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d [007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b (not-active) [008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d (not-active) 49116 1727204697.75652: no more pending results, returning what we have 49116 1727204697.75656: results queue empty 49116 1727204697.75657: checking for any_errors_fatal 49116 1727204697.75675: done checking for any_errors_fatal 49116 1727204697.75676: checking for max_fail_percentage 49116 1727204697.75678: done checking for max_fail_percentage 49116 1727204697.75679: checking to see if all hosts have failed and the running result is not ok 49116 1727204697.75680: done checking to see if all hosts have failed 49116 1727204697.75681: getting the remaining hosts for this loop 49116 1727204697.75682: done getting the remaining hosts for this loop 49116 1727204697.75687: getting the next task for host managed-node3 49116 1727204697.75694: done getting next task for host managed-node3 49116 1727204697.75698: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 49116 1727204697.75701: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204697.75714: getting variables 49116 1727204697.75715: in VariableManager get_vars() 49116 1727204697.75757: Calling all_inventory to load vars for managed-node3 49116 1727204697.75760: Calling groups_inventory to load vars for managed-node3 49116 1727204697.75762: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204697.75774: Calling all_plugins_play to load vars for managed-node3 49116 1727204697.75784: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204697.75788: Calling groups_plugins_play to load vars for managed-node3 49116 1727204697.77036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204697.78257: done with get_vars() 49116 1727204697.78290: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.936) 0:00:20.808 ***** 49116 1727204697.78364: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 49116 1727204697.78367: Creating lock for fedora.linux_system_roles.network_state 49116 1727204697.78666: worker is 1 (out of 1 available) 49116 1727204697.78681: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 49116 1727204697.78695: done queuing things up, now waiting for results queue to drain 49116 1727204697.78697: waiting for pending results... 
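The per-task header lines (for example "Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.936) 0:00:20.808 *****") appear to give the previous task's duration in parentheses followed by the cumulative playbook time; the values in this section are consistent with that reading, since 0:00:19.872 plus the 0.936 s spent applying the connection profiles yields 0:00:20.808. A quick arithmetic check using the figures printed above:

    from datetime import timedelta

    # (cumulative seconds, duration-in-parentheses seconds) per task header above.
    checkpoints = [
        ("Enable network service", 19.793, 0.093),
        ("Ensure initscripts network file dependency is present", 19.833, 0.040),
        ("Configure networking connection profiles", 19.872, 0.039),
        ("Configure networking state", 20.808, 0.936),
    ]

    prev_total = None
    for name, total, duration in checkpoints:
        if prev_total is not None:
            # Each cumulative value equals the previous one plus the listed duration.
            assert abs(total - (prev_total + duration)) < 1e-3, name
        prev_total = total

    print(timedelta(seconds=checkpoints[-1][1]))  # 0:00:20.808000
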
49116 1727204697.78901: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 49116 1727204697.79008: in run() - task 127b8e07-fff9-02f7-957b-000000000027 49116 1727204697.79022: variable 'ansible_search_path' from source: unknown 49116 1727204697.79027: variable 'ansible_search_path' from source: unknown 49116 1727204697.79064: calling self._execute() 49116 1727204697.79143: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204697.79157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204697.79168: variable 'omit' from source: magic vars 49116 1727204697.79500: variable 'ansible_distribution_major_version' from source: facts 49116 1727204697.79511: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204697.79611: variable 'network_state' from source: role '' defaults 49116 1727204697.79621: Evaluated conditional (network_state != {}): False 49116 1727204697.79624: when evaluation is False, skipping this task 49116 1727204697.79627: _execute() done 49116 1727204697.79630: dumping result to json 49116 1727204697.79633: done dumping result, returning 49116 1727204697.79644: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-02f7-957b-000000000027] 49116 1727204697.79648: sending task result for task 127b8e07-fff9-02f7-957b-000000000027 49116 1727204697.79748: done sending task result for task 127b8e07-fff9-02f7-957b-000000000027 49116 1727204697.79751: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204697.79813: no more pending results, returning what we have 49116 1727204697.79817: results queue empty 49116 1727204697.79818: checking for any_errors_fatal 49116 1727204697.79832: done checking for any_errors_fatal 49116 1727204697.79832: checking for max_fail_percentage 49116 1727204697.79835: done checking for max_fail_percentage 49116 1727204697.79836: checking to see if all hosts have failed and the running result is not ok 49116 1727204697.79837: done checking to see if all hosts have failed 49116 1727204697.79838: getting the remaining hosts for this loop 49116 1727204697.79840: done getting the remaining hosts for this loop 49116 1727204697.79845: getting the next task for host managed-node3 49116 1727204697.79852: done getting next task for host managed-node3 49116 1727204697.79856: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 49116 1727204697.79867: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204697.79885: getting variables 49116 1727204697.79887: in VariableManager get_vars() 49116 1727204697.79928: Calling all_inventory to load vars for managed-node3 49116 1727204697.79931: Calling groups_inventory to load vars for managed-node3 49116 1727204697.79933: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204697.79943: Calling all_plugins_play to load vars for managed-node3 49116 1727204697.79945: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204697.79948: Calling groups_plugins_play to load vars for managed-node3 49116 1727204697.81058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204697.82271: done with get_vars() 49116 1727204697.82303: done getting variables 49116 1727204697.82358: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.040) 0:00:20.848 ***** 49116 1727204697.82391: entering _queue_task() for managed-node3/debug 49116 1727204697.82685: worker is 1 (out of 1 available) 49116 1727204697.82701: exiting _queue_task() for managed-node3/debug 49116 1727204697.82714: done queuing things up, now waiting for results queue to drain 49116 1727204697.82716: waiting for pending results... 49116 1727204697.82921: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 49116 1727204697.83031: in run() - task 127b8e07-fff9-02f7-957b-000000000028 49116 1727204697.83049: variable 'ansible_search_path' from source: unknown 49116 1727204697.83054: variable 'ansible_search_path' from source: unknown 49116 1727204697.83091: calling self._execute() 49116 1727204697.83178: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204697.83182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204697.83190: variable 'omit' from source: magic vars 49116 1727204697.83492: variable 'ansible_distribution_major_version' from source: facts 49116 1727204697.83503: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204697.83577: variable 'omit' from source: magic vars 49116 1727204697.83625: variable 'omit' from source: magic vars 49116 1727204697.83686: variable 'omit' from source: magic vars 49116 1727204697.83723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204697.83759: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204697.83778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204697.83793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204697.83804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204697.83829: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204697.83832: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204697.83837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204697.83915: Set connection var ansible_connection to ssh 49116 1727204697.83925: Set connection var ansible_timeout to 10 49116 1727204697.83935: Set connection var ansible_shell_executable to /bin/sh 49116 1727204697.83938: Set connection var ansible_pipelining to False 49116 1727204697.83941: Set connection var ansible_shell_type to sh 49116 1727204697.83946: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204697.83971: variable 'ansible_shell_executable' from source: unknown 49116 1727204697.83975: variable 'ansible_connection' from source: unknown 49116 1727204697.83980: variable 'ansible_module_compression' from source: unknown 49116 1727204697.83982: variable 'ansible_shell_type' from source: unknown 49116 1727204697.83985: variable 'ansible_shell_executable' from source: unknown 49116 1727204697.83987: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204697.83989: variable 'ansible_pipelining' from source: unknown 49116 1727204697.83991: variable 'ansible_timeout' from source: unknown 49116 1727204697.83994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204697.84112: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204697.84119: variable 'omit' from source: magic vars 49116 1727204697.84125: starting attempt loop 49116 1727204697.84128: running the handler 49116 1727204697.84241: variable '__network_connections_result' from source: set_fact 49116 1727204697.84296: handler run complete 49116 1727204697.84311: attempt loop complete, returning result 49116 1727204697.84314: _execute() done 49116 1727204697.84317: dumping result to json 49116 1727204697.84321: done dumping result, returning 49116 1727204697.84329: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-02f7-957b-000000000028] 49116 1727204697.84332: sending task result for task 127b8e07-fff9-02f7-957b-000000000028 49116 1727204697.84432: done sending task result for task 127b8e07-fff9-02f7-957b-000000000028 49116 1727204697.84437: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d (not-active)" ] } 49116 1727204697.84509: no more pending results, returning what we have 49116 1727204697.84512: results queue empty 49116 1727204697.84513: checking for any_errors_fatal 49116 1727204697.84521: done checking for any_errors_fatal 49116 1727204697.84522: checking for max_fail_percentage 49116 
1727204697.84524: done checking for max_fail_percentage 49116 1727204697.84525: checking to see if all hosts have failed and the running result is not ok 49116 1727204697.84526: done checking to see if all hosts have failed 49116 1727204697.84527: getting the remaining hosts for this loop 49116 1727204697.84528: done getting the remaining hosts for this loop 49116 1727204697.84535: getting the next task for host managed-node3 49116 1727204697.84541: done getting next task for host managed-node3 49116 1727204697.84553: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 49116 1727204697.84556: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204697.84571: getting variables 49116 1727204697.84573: in VariableManager get_vars() 49116 1727204697.84613: Calling all_inventory to load vars for managed-node3 49116 1727204697.84616: Calling groups_inventory to load vars for managed-node3 49116 1727204697.84618: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204697.84627: Calling all_plugins_play to load vars for managed-node3 49116 1727204697.84630: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204697.84635: Calling groups_plugins_play to load vars for managed-node3 49116 1727204697.85935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204697.88302: done with get_vars() 49116 1727204697.88354: done getting variables 49116 1727204697.88440: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.060) 0:00:20.909 ***** 49116 1727204697.88492: entering _queue_task() for managed-node3/debug 49116 1727204697.88850: worker is 1 (out of 1 available) 49116 1727204697.88868: exiting _queue_task() for managed-node3/debug 49116 1727204697.88882: done queuing things up, now waiting for results queue to drain 49116 1727204697.88883: waiting for pending results... 
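The "Show stderr messages for the network_connections" debug task above prints __network_connections_result.stderr_lines, and the next task dumps the whole __network_connections_result, including the raw newline-joined stderr string. The two fields carry the same information and differ only by a line split, roughly as follows (a sketch, not the exact code path Ansible uses; the sample keeps just the first two messages):

    # stderr as returned by the network_connections module (abridged from the log)
    stderr = (
        "[005] #0, state:up persistent_state:present, 'lsr101': "
        "add connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b\n"
        "[006] #1, state:up persistent_state:present, 'lsr101.90': "
        "add connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d\n"
    )
    stderr_lines = stderr.splitlines()
    print(len(stderr_lines))   # 2
    print(stderr_lines[0])     # "[005] #0, ... add connection lsr101, ..."
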
49116 1727204697.89084: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 49116 1727204697.89187: in run() - task 127b8e07-fff9-02f7-957b-000000000029 49116 1727204697.89203: variable 'ansible_search_path' from source: unknown 49116 1727204697.89207: variable 'ansible_search_path' from source: unknown 49116 1727204697.89241: calling self._execute() 49116 1727204697.89322: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204697.89326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204697.89339: variable 'omit' from source: magic vars 49116 1727204697.89643: variable 'ansible_distribution_major_version' from source: facts 49116 1727204697.89651: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204697.89656: variable 'omit' from source: magic vars 49116 1727204697.89705: variable 'omit' from source: magic vars 49116 1727204697.89732: variable 'omit' from source: magic vars 49116 1727204697.89772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204697.89802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204697.89818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204697.89836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204697.89845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204697.89873: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204697.89878: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204697.89881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204697.89952: Set connection var ansible_connection to ssh 49116 1727204697.89968: Set connection var ansible_timeout to 10 49116 1727204697.89974: Set connection var ansible_shell_executable to /bin/sh 49116 1727204697.89976: Set connection var ansible_pipelining to False 49116 1727204697.89979: Set connection var ansible_shell_type to sh 49116 1727204697.89986: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204697.90006: variable 'ansible_shell_executable' from source: unknown 49116 1727204697.90009: variable 'ansible_connection' from source: unknown 49116 1727204697.90012: variable 'ansible_module_compression' from source: unknown 49116 1727204697.90015: variable 'ansible_shell_type' from source: unknown 49116 1727204697.90018: variable 'ansible_shell_executable' from source: unknown 49116 1727204697.90021: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204697.90023: variable 'ansible_pipelining' from source: unknown 49116 1727204697.90026: variable 'ansible_timeout' from source: unknown 49116 1727204697.90031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204697.90146: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 
1727204697.90156: variable 'omit' from source: magic vars 49116 1727204697.90161: starting attempt loop 49116 1727204697.90164: running the handler 49116 1727204697.90212: variable '__network_connections_result' from source: set_fact 49116 1727204697.90273: variable '__network_connections_result' from source: set_fact 49116 1727204697.90395: handler run complete 49116 1727204697.90423: attempt loop complete, returning result 49116 1727204697.90427: _execute() done 49116 1727204697.90430: dumping result to json 49116 1727204697.90435: done dumping result, returning 49116 1727204697.90442: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-02f7-957b-000000000029] 49116 1727204697.90446: sending task result for task 127b8e07-fff9-02f7-957b-000000000029 49116 1727204697.90553: done sending task result for task 127b8e07-fff9-02f7-957b-000000000029 49116 1727204697.90556: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, f99c432c-37f8-451d-93e7-dceda575981b (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 04325438-476c-4a74-b269-34ef9c78263d (not-active)" ] } } 49116 1727204697.90688: no more pending results, returning what we have 49116 1727204697.90691: results queue empty 49116 1727204697.90692: checking for any_errors_fatal 49116 1727204697.90699: done checking for any_errors_fatal 49116 1727204697.90699: checking for max_fail_percentage 49116 1727204697.90702: done checking for max_fail_percentage 49116 1727204697.90703: checking to see if all hosts have failed and the running result is not ok 49116 1727204697.90703: done checking to see if all hosts have failed 49116 1727204697.90704: getting the remaining hosts for this loop 49116 1727204697.90705: done getting the remaining hosts for this loop 49116 1727204697.90710: getting the next task for host managed-node3 49116 1727204697.90716: done getting next task for host managed-node3 49116 1727204697.90726: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the 
network_state 49116 1727204697.90729: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204697.90742: getting variables 49116 1727204697.90744: in VariableManager get_vars() 49116 1727204697.90791: Calling all_inventory to load vars for managed-node3 49116 1727204697.90794: Calling groups_inventory to load vars for managed-node3 49116 1727204697.90796: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204697.90807: Calling all_plugins_play to load vars for managed-node3 49116 1727204697.90809: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204697.90812: Calling groups_plugins_play to load vars for managed-node3 49116 1727204697.92709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204697.94983: done with get_vars() 49116 1727204697.95025: done getting variables 49116 1727204697.95095: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:57 -0400 (0:00:00.066) 0:00:20.976 ***** 49116 1727204697.95138: entering _queue_task() for managed-node3/debug 49116 1727204697.95539: worker is 1 (out of 1 available) 49116 1727204697.95668: exiting _queue_task() for managed-node3/debug 49116 1727204697.95681: done queuing things up, now waiting for results queue to drain 49116 1727204697.95682: waiting for pending results... 
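The module_args dumped in the previous result show that the test play is driving the role with a network_connections list describing an Ethernet parent and a VLAN child with distinct MTUs. A plausible reconstruction of that variable, using only the values visible in the result, is:

    network_connections:
      - name: lsr101
        type: ethernet
        state: up
        autoconnect: false
        mtu: 1492
        ip:
          dhcp4: false
          auto6: false
      - name: lsr101.90
        type: vlan
        parent: lsr101
        vlan_id: 90
        state: up
        autoconnect: false
        mtu: 1280
        ip:
          dhcp4: false
          auto6: false

This is inferred from the module arguments; the playbook itself may template the interface names (it also defines vlan_interface, which appears later in this run).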
49116 1727204697.96003: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 49116 1727204697.96102: in run() - task 127b8e07-fff9-02f7-957b-00000000002a 49116 1727204697.96111: variable 'ansible_search_path' from source: unknown 49116 1727204697.96173: variable 'ansible_search_path' from source: unknown 49116 1727204697.96177: calling self._execute() 49116 1727204697.96309: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204697.96327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204697.96346: variable 'omit' from source: magic vars 49116 1727204697.96828: variable 'ansible_distribution_major_version' from source: facts 49116 1727204697.96858: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204697.97012: variable 'network_state' from source: role '' defaults 49116 1727204697.97065: Evaluated conditional (network_state != {}): False 49116 1727204697.97074: when evaluation is False, skipping this task 49116 1727204697.97078: _execute() done 49116 1727204697.97080: dumping result to json 49116 1727204697.97083: done dumping result, returning 49116 1727204697.97086: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-02f7-957b-00000000002a] 49116 1727204697.97088: sending task result for task 127b8e07-fff9-02f7-957b-00000000002a skipping: [managed-node3] => { "false_condition": "network_state != {}" } 49116 1727204697.97336: no more pending results, returning what we have 49116 1727204697.97341: results queue empty 49116 1727204697.97342: checking for any_errors_fatal 49116 1727204697.97355: done checking for any_errors_fatal 49116 1727204697.97356: checking for max_fail_percentage 49116 1727204697.97358: done checking for max_fail_percentage 49116 1727204697.97359: checking to see if all hosts have failed and the running result is not ok 49116 1727204697.97360: done checking to see if all hosts have failed 49116 1727204697.97361: getting the remaining hosts for this loop 49116 1727204697.97363: done getting the remaining hosts for this loop 49116 1727204697.97369: getting the next task for host managed-node3 49116 1727204697.97378: done getting next task for host managed-node3 49116 1727204697.97383: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 49116 1727204697.97388: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204697.97406: getting variables 49116 1727204697.97408: in VariableManager get_vars() 49116 1727204697.97462: Calling all_inventory to load vars for managed-node3 49116 1727204697.97669: Calling groups_inventory to load vars for managed-node3 49116 1727204697.97673: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204697.97680: done sending task result for task 127b8e07-fff9-02f7-957b-00000000002a 49116 1727204697.97683: WORKER PROCESS EXITING 49116 1727204697.97699: Calling all_plugins_play to load vars for managed-node3 49116 1727204697.97702: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204697.97706: Calling groups_plugins_play to load vars for managed-node3 49116 1727204697.99588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204698.00805: done with get_vars() 49116 1727204698.00839: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:58 -0400 (0:00:00.057) 0:00:21.034 ***** 49116 1727204698.00919: entering _queue_task() for managed-node3/ping 49116 1727204698.00921: Creating lock for ping 49116 1727204698.01247: worker is 1 (out of 1 available) 49116 1727204698.01261: exiting _queue_task() for managed-node3/ping 49116 1727204698.01276: done queuing things up, now waiting for results queue to drain 49116 1727204698.01277: waiting for pending results... 49116 1727204698.01689: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 49116 1727204698.01760: in run() - task 127b8e07-fff9-02f7-957b-00000000002b 49116 1727204698.01791: variable 'ansible_search_path' from source: unknown 49116 1727204698.01799: variable 'ansible_search_path' from source: unknown 49116 1727204698.01846: calling self._execute() 49116 1727204698.01958: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204698.01972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204698.01987: variable 'omit' from source: magic vars 49116 1727204698.02420: variable 'ansible_distribution_major_version' from source: facts 49116 1727204698.02438: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204698.02443: variable 'omit' from source: magic vars 49116 1727204698.02487: variable 'omit' from source: magic vars 49116 1727204698.02516: variable 'omit' from source: magic vars 49116 1727204698.02554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204698.02588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204698.02604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204698.02631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204698.02642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204698.02669: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204698.02673: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204698.02675: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204698.02749: Set connection var ansible_connection to ssh 49116 1727204698.02762: Set connection var ansible_timeout to 10 49116 1727204698.02770: Set connection var ansible_shell_executable to /bin/sh 49116 1727204698.02776: Set connection var ansible_pipelining to False 49116 1727204698.02778: Set connection var ansible_shell_type to sh 49116 1727204698.02784: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204698.02802: variable 'ansible_shell_executable' from source: unknown 49116 1727204698.02805: variable 'ansible_connection' from source: unknown 49116 1727204698.02808: variable 'ansible_module_compression' from source: unknown 49116 1727204698.02811: variable 'ansible_shell_type' from source: unknown 49116 1727204698.02813: variable 'ansible_shell_executable' from source: unknown 49116 1727204698.02816: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204698.02820: variable 'ansible_pipelining' from source: unknown 49116 1727204698.02822: variable 'ansible_timeout' from source: unknown 49116 1727204698.02827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204698.02989: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204698.02999: variable 'omit' from source: magic vars 49116 1727204698.03004: starting attempt loop 49116 1727204698.03008: running the handler 49116 1727204698.03020: _low_level_execute_command(): starting 49116 1727204698.03027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204698.03568: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.03598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.03602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 49116 1727204698.03605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204698.03607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.03669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204698.03673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204698.03676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.03758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204698.05628: stdout chunk (state=3): >>>/root <<< 49116 1727204698.05844: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204698.05849: stdout chunk (state=3): >>><<< 49116 1727204698.05852: stderr chunk (state=3): >>><<< 49116 1727204698.05878: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204698.05901: _low_level_execute_command(): starting 49116 1727204698.05916: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409 `" && echo ansible-tmp-1727204698.05887-50550-244788331386409="` echo /root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409 `" ) && sleep 0' 49116 1727204698.06783: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204698.06847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.06930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204698.09124: stdout chunk (state=3): >>>ansible-tmp-1727204698.05887-50550-244788331386409=/root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409 <<< 49116 1727204698.09347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204698.09352: stdout chunk (state=3): >>><<< 49116 1727204698.09354: stderr chunk (state=3): >>><<< 
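Each _low_level_execute_command() round trip above reuses an existing SSH master ("auto-mux: Trying existing master at '/root/.ansible/cp/1846617821'"), so only the first connection to 10.31.45.169 pays the full handshake cost. That behaviour comes from ControlMaster/ControlPersist options Ansible passes to ssh; one way to set them explicitly, shown here only as a sketch and not taken from this run's configuration, is via ansible_ssh_common_args:

    # group_vars/host_vars sketch; option values are illustrative, not read from this run
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
      -o ControlPath=~/.ansible/cp/%C

With a persisted master, the mkdir, transfer, execute, and cleanup commands for each module share one SSH session, which is why consecutive commands in this log complete so quickly.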
49116 1727204698.09573: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204698.05887-50550-244788331386409=/root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204698.09577: variable 'ansible_module_compression' from source: unknown 49116 1727204698.09580: ANSIBALLZ: Using lock for ping 49116 1727204698.09582: ANSIBALLZ: Acquiring lock 49116 1727204698.09585: ANSIBALLZ: Lock acquired: 139720118195344 49116 1727204698.09587: ANSIBALLZ: Creating module 49116 1727204698.34739: ANSIBALLZ: Writing module into payload 49116 1727204698.34832: ANSIBALLZ: Writing module 49116 1727204698.34865: ANSIBALLZ: Renaming module 49116 1727204698.35008: ANSIBALLZ: Done creating module 49116 1727204698.35032: variable 'ansible_facts' from source: unknown 49116 1727204698.35214: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/AnsiballZ_ping.py 49116 1727204698.35351: Sending initial data 49116 1727204698.35355: Sent initial data (151 bytes) 49116 1727204698.36106: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.36157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204698.36186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.36296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 49116 1727204698.38135: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204698.38234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204698.38323: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpm0vks3up /root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/AnsiballZ_ping.py <<< 49116 1727204698.38327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/AnsiballZ_ping.py" <<< 49116 1727204698.38400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpm0vks3up" to remote "/root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/AnsiballZ_ping.py" <<< 49116 1727204698.39255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204698.39357: stderr chunk (state=3): >>><<< 49116 1727204698.39361: stdout chunk (state=3): >>><<< 49116 1727204698.39460: done transferring module to remote 49116 1727204698.39465: _low_level_execute_command(): starting 49116 1727204698.39472: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/ /root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/AnsiballZ_ping.py && sleep 0' 49116 1727204698.40047: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204698.40056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204698.40070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.40088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204698.40100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204698.40107: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204698.40117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.40138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204698.40142: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204698.40145: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49116 1727204698.40153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204698.40162: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.40177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204698.40185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204698.40243: stderr chunk (state=3): >>>debug2: match found <<< 49116 1727204698.40251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.40269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204698.40333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204698.40337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.40413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204698.42437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204698.42546: stderr chunk (state=3): >>><<< 49116 1727204698.42550: stdout chunk (state=3): >>><<< 49116 1727204698.42553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204698.42555: _low_level_execute_command(): starting 49116 1727204698.42558: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/AnsiballZ_ping.py && sleep 0' 49116 1727204698.43056: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.43060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204698.43062: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.43067: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.43124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204698.43128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204698.43133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.43212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204698.60631: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 49116 1727204698.62194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204698.62257: stderr chunk (state=3): >>><<< 49116 1727204698.62262: stdout chunk (state=3): >>><<< 49116 1727204698.62279: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
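The pong reply above comes from the role's "Re-test connectivity" task (main.yml:192), which the executor runs through the ping module: a remote temp directory is created, the zipped AnsiballZ_ping.py payload is transferred over SFTP, made executable, run with /usr/bin/python3.12, and the temp directory is removed in the entries that follow. In playbook terms the task is roughly equivalent to:

    - name: Re-test connectivity
      ansible.builtin.ping:

The real role task may carry extra tags or conditions; only the module and the task name are taken from this log.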
49116 1727204698.62300: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204698.62308: _low_level_execute_command(): starting 49116 1727204698.62313: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204698.05887-50550-244788331386409/ > /dev/null 2>&1 && sleep 0' 49116 1727204698.62821: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.62825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.62827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.62831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.62888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204698.62892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204698.62896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.62970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204698.64989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204698.65054: stderr chunk (state=3): >>><<< 49116 1727204698.65057: stdout chunk (state=3): >>><<< 49116 1727204698.65077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204698.65082: handler run complete 49116 1727204698.65095: attempt loop complete, returning result 49116 1727204698.65097: _execute() done 49116 1727204698.65100: dumping result to json 49116 1727204698.65107: done dumping result, returning 49116 1727204698.65114: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-02f7-957b-00000000002b] 49116 1727204698.65116: sending task result for task 127b8e07-fff9-02f7-957b-00000000002b 49116 1727204698.65212: done sending task result for task 127b8e07-fff9-02f7-957b-00000000002b 49116 1727204698.65216: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 49116 1727204698.65277: no more pending results, returning what we have 49116 1727204698.65280: results queue empty 49116 1727204698.65281: checking for any_errors_fatal 49116 1727204698.65286: done checking for any_errors_fatal 49116 1727204698.65287: checking for max_fail_percentage 49116 1727204698.65289: done checking for max_fail_percentage 49116 1727204698.65290: checking to see if all hosts have failed and the running result is not ok 49116 1727204698.65290: done checking to see if all hosts have failed 49116 1727204698.65291: getting the remaining hosts for this loop 49116 1727204698.65293: done getting the remaining hosts for this loop 49116 1727204698.65297: getting the next task for host managed-node3 49116 1727204698.65306: done getting next task for host managed-node3 49116 1727204698.65308: ^ task is: TASK: meta (role_complete) 49116 1727204698.65311: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204698.65321: getting variables 49116 1727204698.65323: in VariableManager get_vars() 49116 1727204698.65364: Calling all_inventory to load vars for managed-node3 49116 1727204698.65369: Calling groups_inventory to load vars for managed-node3 49116 1727204698.65371: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204698.65381: Calling all_plugins_play to load vars for managed-node3 49116 1727204698.65384: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204698.65387: Calling groups_plugins_play to load vars for managed-node3 49116 1727204698.66474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204698.67669: done with get_vars() 49116 1727204698.67695: done getting variables 49116 1727204698.67769: done queuing things up, now waiting for results queue to drain 49116 1727204698.67771: results queue empty 49116 1727204698.67772: checking for any_errors_fatal 49116 1727204698.67774: done checking for any_errors_fatal 49116 1727204698.67775: checking for max_fail_percentage 49116 1727204698.67775: done checking for max_fail_percentage 49116 1727204698.67776: checking to see if all hosts have failed and the running result is not ok 49116 1727204698.67776: done checking to see if all hosts have failed 49116 1727204698.67777: getting the remaining hosts for this loop 49116 1727204698.67777: done getting the remaining hosts for this loop 49116 1727204698.67780: getting the next task for host managed-node3 49116 1727204698.67783: done getting next task for host managed-node3 49116 1727204698.67785: ^ task is: TASK: Include the task 'assert_device_present.yml' 49116 1727204698.67786: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204698.67788: getting variables 49116 1727204698.67789: in VariableManager get_vars() 49116 1727204698.67800: Calling all_inventory to load vars for managed-node3 49116 1727204698.67801: Calling groups_inventory to load vars for managed-node3 49116 1727204698.67803: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204698.67806: Calling all_plugins_play to load vars for managed-node3 49116 1727204698.67808: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204698.67810: Calling groups_plugins_play to load vars for managed-node3 49116 1727204698.72819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204698.74003: done with get_vars() 49116 1727204698.74030: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:46 Tuesday 24 September 2024 15:04:58 -0400 (0:00:00.731) 0:00:21.765 ***** 49116 1727204698.74094: entering _queue_task() for managed-node3/include_tasks 49116 1727204698.74437: worker is 1 (out of 1 available) 49116 1727204698.74451: exiting _queue_task() for managed-node3/include_tasks 49116 1727204698.74467: done queuing things up, now waiting for results queue to drain 49116 1727204698.74469: waiting for pending results... 
49116 1727204698.74675: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' 49116 1727204698.74755: in run() - task 127b8e07-fff9-02f7-957b-00000000005b 49116 1727204698.74768: variable 'ansible_search_path' from source: unknown 49116 1727204698.74808: calling self._execute() 49116 1727204698.74891: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204698.74895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204698.74906: variable 'omit' from source: magic vars 49116 1727204698.75246: variable 'ansible_distribution_major_version' from source: facts 49116 1727204698.75263: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204698.75268: _execute() done 49116 1727204698.75271: dumping result to json 49116 1727204698.75276: done dumping result, returning 49116 1727204698.75283: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_present.yml' [127b8e07-fff9-02f7-957b-00000000005b] 49116 1727204698.75288: sending task result for task 127b8e07-fff9-02f7-957b-00000000005b 49116 1727204698.75397: done sending task result for task 127b8e07-fff9-02f7-957b-00000000005b 49116 1727204698.75400: WORKER PROCESS EXITING 49116 1727204698.75433: no more pending results, returning what we have 49116 1727204698.75439: in VariableManager get_vars() 49116 1727204698.75491: Calling all_inventory to load vars for managed-node3 49116 1727204698.75494: Calling groups_inventory to load vars for managed-node3 49116 1727204698.75496: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204698.75513: Calling all_plugins_play to load vars for managed-node3 49116 1727204698.75517: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204698.75520: Calling groups_plugins_play to load vars for managed-node3 49116 1727204698.76654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204698.77863: done with get_vars() 49116 1727204698.77889: variable 'ansible_search_path' from source: unknown 49116 1727204698.77904: we have included files to process 49116 1727204698.77905: generating all_blocks data 49116 1727204698.77906: done generating all_blocks data 49116 1727204698.77910: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49116 1727204698.77911: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49116 1727204698.77912: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49116 1727204698.78000: in VariableManager get_vars() 49116 1727204698.78021: done with get_vars() 49116 1727204698.78114: done processing included file 49116 1727204698.78116: iterating over new_blocks loaded from include file 49116 1727204698.78117: in VariableManager get_vars() 49116 1727204698.78131: done with get_vars() 49116 1727204698.78132: filtering new block on tags 49116 1727204698.78147: done filtering new block on tags 49116 1727204698.78148: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 49116 1727204698.78152: extending task lists for 
all hosts with included blocks 49116 1727204698.79829: done extending task lists 49116 1727204698.79831: done processing included files 49116 1727204698.79832: results queue empty 49116 1727204698.79832: checking for any_errors_fatal 49116 1727204698.79834: done checking for any_errors_fatal 49116 1727204698.79835: checking for max_fail_percentage 49116 1727204698.79837: done checking for max_fail_percentage 49116 1727204698.79837: checking to see if all hosts have failed and the running result is not ok 49116 1727204698.79838: done checking to see if all hosts have failed 49116 1727204698.79839: getting the remaining hosts for this loop 49116 1727204698.79840: done getting the remaining hosts for this loop 49116 1727204698.79842: getting the next task for host managed-node3 49116 1727204698.79845: done getting next task for host managed-node3 49116 1727204698.79847: ^ task is: TASK: Include the task 'get_interface_stat.yml' 49116 1727204698.79848: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204698.79850: getting variables 49116 1727204698.79851: in VariableManager get_vars() 49116 1727204698.79868: Calling all_inventory to load vars for managed-node3 49116 1727204698.79870: Calling groups_inventory to load vars for managed-node3 49116 1727204698.79871: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204698.79879: Calling all_plugins_play to load vars for managed-node3 49116 1727204698.79880: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204698.79882: Calling groups_plugins_play to load vars for managed-node3 49116 1727204698.80755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204698.82061: done with get_vars() 49116 1727204698.82085: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:04:58 -0400 (0:00:00.080) 0:00:21.846 ***** 49116 1727204698.82150: entering _queue_task() for managed-node3/include_tasks 49116 1727204698.82449: worker is 1 (out of 1 available) 49116 1727204698.82464: exiting _queue_task() for managed-node3/include_tasks 49116 1727204698.82480: done queuing things up, now waiting for results queue to drain 49116 1727204698.82481: waiting for pending results... 
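The two includes above chain the test helpers: tests_vlan_mtu.yml includes assert_device_present.yml, which in turn includes get_interface_stat.yml and, judging by its name, asserts on the result afterwards. A hedged sketch of the outer include, using only the paths reported in the log, is:

    - name: Include the task 'assert_device_present.yml'
      ansible.builtin.include_tasks:
        file: tasks/assert_device_present.yml

Any vars or loop the real test playbook passes to the include are not visible in this part of the log and are omitted here.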
49116 1727204698.82681: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 49116 1727204698.82756: in run() - task 127b8e07-fff9-02f7-957b-000000000578 49116 1727204698.82769: variable 'ansible_search_path' from source: unknown 49116 1727204698.82773: variable 'ansible_search_path' from source: unknown 49116 1727204698.82807: calling self._execute() 49116 1727204698.82890: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204698.82894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204698.82906: variable 'omit' from source: magic vars 49116 1727204698.83233: variable 'ansible_distribution_major_version' from source: facts 49116 1727204698.83247: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204698.83251: _execute() done 49116 1727204698.83255: dumping result to json 49116 1727204698.83260: done dumping result, returning 49116 1727204698.83264: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-02f7-957b-000000000578] 49116 1727204698.83277: sending task result for task 127b8e07-fff9-02f7-957b-000000000578 49116 1727204698.83369: done sending task result for task 127b8e07-fff9-02f7-957b-000000000578 49116 1727204698.83372: WORKER PROCESS EXITING 49116 1727204698.83405: no more pending results, returning what we have 49116 1727204698.83411: in VariableManager get_vars() 49116 1727204698.83461: Calling all_inventory to load vars for managed-node3 49116 1727204698.83464: Calling groups_inventory to load vars for managed-node3 49116 1727204698.83468: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204698.83484: Calling all_plugins_play to load vars for managed-node3 49116 1727204698.83487: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204698.83490: Calling groups_plugins_play to load vars for managed-node3 49116 1727204698.84539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204698.85750: done with get_vars() 49116 1727204698.85775: variable 'ansible_search_path' from source: unknown 49116 1727204698.85776: variable 'ansible_search_path' from source: unknown 49116 1727204698.85807: we have included files to process 49116 1727204698.85808: generating all_blocks data 49116 1727204698.85810: done generating all_blocks data 49116 1727204698.85810: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49116 1727204698.85811: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49116 1727204698.85813: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49116 1727204698.85959: done processing included file 49116 1727204698.85961: iterating over new_blocks loaded from include file 49116 1727204698.85963: in VariableManager get_vars() 49116 1727204698.85981: done with get_vars() 49116 1727204698.85982: filtering new block on tags 49116 1727204698.85994: done filtering new block on tags 49116 1727204698.85995: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 49116 
1727204698.85999: extending task lists for all hosts with included blocks 49116 1727204698.86073: done extending task lists 49116 1727204698.86074: done processing included files 49116 1727204698.86074: results queue empty 49116 1727204698.86075: checking for any_errors_fatal 49116 1727204698.86078: done checking for any_errors_fatal 49116 1727204698.86078: checking for max_fail_percentage 49116 1727204698.86079: done checking for max_fail_percentage 49116 1727204698.86080: checking to see if all hosts have failed and the running result is not ok 49116 1727204698.86080: done checking to see if all hosts have failed 49116 1727204698.86081: getting the remaining hosts for this loop 49116 1727204698.86082: done getting the remaining hosts for this loop 49116 1727204698.86083: getting the next task for host managed-node3 49116 1727204698.86086: done getting next task for host managed-node3 49116 1727204698.86088: ^ task is: TASK: Get stat for interface {{ interface }} 49116 1727204698.86090: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204698.86092: getting variables 49116 1727204698.86092: in VariableManager get_vars() 49116 1727204698.86103: Calling all_inventory to load vars for managed-node3 49116 1727204698.86104: Calling groups_inventory to load vars for managed-node3 49116 1727204698.86106: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204698.86110: Calling all_plugins_play to load vars for managed-node3 49116 1727204698.86111: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204698.86113: Calling groups_plugins_play to load vars for managed-node3 49116 1727204698.87030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204698.88206: done with get_vars() 49116 1727204698.88232: done getting variables 49116 1727204698.88373: variable 'interface' from source: include params 49116 1727204698.88376: variable 'vlan_interface' from source: play vars 49116 1727204698.88425: variable 'vlan_interface' from source: play vars TASK [Get stat for interface lsr101.90] **************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:58 -0400 (0:00:00.062) 0:00:21.909 ***** 49116 1727204698.88452: entering _queue_task() for managed-node3/stat 49116 1727204698.88746: worker is 1 (out of 1 available) 49116 1727204698.88762: exiting _queue_task() for managed-node3/stat 49116 1727204698.88777: done queuing things up, now waiting for results queue to drain 49116 1727204698.88779: waiting for pending results... 
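The task title "Get stat for interface {{ interface }}" is templated from the interface variable, which the include resolves from the play's vlan_interface ("lsr101.90"). Since the executor queues it as a stat action, a plausible minimal form of get_interface_stat.yml is the following sketch (the /sys/class/net path and the register name are assumptions, not confirmed by this log):

    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        path: /sys/class/net/{{ interface }}   # assumed device path; not shown in this log
      register: interface_stat                 # register name assumed

A later assert can then check interface_stat.stat.exists to confirm the VLAN device was actually created.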
49116 1727204698.88984: running TaskExecutor() for managed-node3/TASK: Get stat for interface lsr101.90 49116 1727204698.89072: in run() - task 127b8e07-fff9-02f7-957b-00000000069c 49116 1727204698.89085: variable 'ansible_search_path' from source: unknown 49116 1727204698.89090: variable 'ansible_search_path' from source: unknown 49116 1727204698.89126: calling self._execute() 49116 1727204698.89209: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204698.89218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204698.89230: variable 'omit' from source: magic vars 49116 1727204698.89549: variable 'ansible_distribution_major_version' from source: facts 49116 1727204698.89566: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204698.89571: variable 'omit' from source: magic vars 49116 1727204698.89605: variable 'omit' from source: magic vars 49116 1727204698.89689: variable 'interface' from source: include params 49116 1727204698.89693: variable 'vlan_interface' from source: play vars 49116 1727204698.89742: variable 'vlan_interface' from source: play vars 49116 1727204698.89759: variable 'omit' from source: magic vars 49116 1727204698.89799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204698.89830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204698.89850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204698.89867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204698.89880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204698.89906: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204698.89909: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204698.89911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204698.89987: Set connection var ansible_connection to ssh 49116 1727204698.90002: Set connection var ansible_timeout to 10 49116 1727204698.90008: Set connection var ansible_shell_executable to /bin/sh 49116 1727204698.90014: Set connection var ansible_pipelining to False 49116 1727204698.90018: Set connection var ansible_shell_type to sh 49116 1727204698.90023: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204698.90045: variable 'ansible_shell_executable' from source: unknown 49116 1727204698.90048: variable 'ansible_connection' from source: unknown 49116 1727204698.90051: variable 'ansible_module_compression' from source: unknown 49116 1727204698.90053: variable 'ansible_shell_type' from source: unknown 49116 1727204698.90056: variable 'ansible_shell_executable' from source: unknown 49116 1727204698.90058: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204698.90063: variable 'ansible_pipelining' from source: unknown 49116 1727204698.90068: variable 'ansible_timeout' from source: unknown 49116 1727204698.90072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204698.90241: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204698.90251: variable 'omit' from source: magic vars 49116 1727204698.90257: starting attempt loop 49116 1727204698.90260: running the handler 49116 1727204698.90274: _low_level_execute_command(): starting 49116 1727204698.90280: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204698.90842: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.90846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204698.90850: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204698.90854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.90908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204698.90916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204698.90920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.90992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204698.92850: stdout chunk (state=3): >>>/root <<< 49116 1727204698.92952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204698.93019: stderr chunk (state=3): >>><<< 49116 1727204698.93022: stdout chunk (state=3): >>><<< 49116 1727204698.93045: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 
1727204698.93058: _low_level_execute_command(): starting 49116 1727204698.93065: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018 `" && echo ansible-tmp-1727204698.930447-50572-115232844329018="` echo /root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018 `" ) && sleep 0' 49116 1727204698.93892: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.93974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204698.96189: stdout chunk (state=3): >>>ansible-tmp-1727204698.930447-50572-115232844329018=/root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018 <<< 49116 1727204698.96313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204698.96374: stderr chunk (state=3): >>><<< 49116 1727204698.96377: stdout chunk (state=3): >>><<< 49116 1727204698.96395: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204698.930447-50572-115232844329018=/root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204698.96447: variable 'ansible_module_compression' from source: unknown 49116 1727204698.96493: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 49116 1727204698.96533: variable 'ansible_facts' from source: unknown 49116 1727204698.96600: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/AnsiballZ_stat.py 49116 1727204698.96716: Sending initial data 49116 1727204698.96719: Sent initial data (152 bytes) 49116 1727204698.97318: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.97322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.97325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204698.97327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204698.97423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204698.97494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204698.99346: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204698.99430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204698.99506: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp06_ll_9v /root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/AnsiballZ_stat.py <<< 49116 1727204698.99509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/AnsiballZ_stat.py" <<< 49116 1727204698.99584: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp06_ll_9v" to remote "/root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/AnsiballZ_stat.py" <<< 49116 1727204699.00575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204699.00579: stdout chunk (state=3): >>><<< 49116 1727204699.00583: stderr chunk (state=3): >>><<< 49116 1727204699.00586: done transferring module to remote 49116 1727204699.00588: _low_level_execute_command(): starting 49116 1727204699.00590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/ /root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/AnsiballZ_stat.py && sleep 0' 49116 1727204699.01279: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204699.01399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204699.01429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204699.01530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204699.03771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204699.03777: stdout chunk (state=3): >>><<< 49116 1727204699.03780: stderr chunk (state=3): >>><<< 49116 1727204699.03784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204699.03793: _low_level_execute_command(): starting 49116 1727204699.03795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/AnsiballZ_stat.py && sleep 0' 49116 1727204699.04390: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204699.04406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204699.04424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204699.04446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204699.04468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204699.04482: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204699.04577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204699.04597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204699.04614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204699.04638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204699.04756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204699.22590: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 42379, "dev": 23, "nlink": 1, "atime": 1727204697.6535509, "mtime": 1727204697.6535509, "ctime": 1727204697.6535509, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, 
"get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 49116 1727204699.24236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204699.24249: stdout chunk (state=3): >>><<< 49116 1727204699.24270: stderr chunk (state=3): >>><<< 49116 1727204699.24448: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 42379, "dev": 23, "nlink": 1, "atime": 1727204697.6535509, "mtime": 1727204697.6535509, "ctime": 1727204697.6535509, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204699.24453: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204699.24455: _low_level_execute_command(): starting 49116 1727204699.24458: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204698.930447-50572-115232844329018/ > /dev/null 2>&1 && sleep 0' 49116 1727204699.25088: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204699.25107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204699.25124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204699.25139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204699.25255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204699.27407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204699.27425: stdout chunk (state=3): >>><<< 49116 1727204699.27448: stderr chunk (state=3): >>><<< 49116 1727204699.27473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204699.27485: handler run complete 49116 1727204699.27678: attempt loop complete, returning result 49116 1727204699.27681: _execute() done 49116 1727204699.27684: dumping result to json 49116 1727204699.27686: done dumping result, returning 49116 1727204699.27687: done running TaskExecutor() for managed-node3/TASK: Get stat for interface lsr101.90 [127b8e07-fff9-02f7-957b-00000000069c] 49116 1727204699.27689: sending task result for task 127b8e07-fff9-02f7-957b-00000000069c 49116 1727204699.27772: done sending task result for task 127b8e07-fff9-02f7-957b-00000000069c 49116 1727204699.27775: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204697.6535509, "block_size": 4096, "blocks": 0, "ctime": 1727204697.6535509, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 42379, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "mode": "0777", "mtime": 1727204697.6535509, "nlink": 1, "path": "/sys/class/net/lsr101.90", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 49116 1727204699.27888: no more pending results, returning what we have 49116 1727204699.27892: results queue empty 49116 1727204699.27893: checking for any_errors_fatal 49116 1727204699.27895: done checking for any_errors_fatal 49116 1727204699.27895: checking for max_fail_percentage 49116 1727204699.27898: done checking for max_fail_percentage 49116 1727204699.27899: checking to see if all hosts have failed and the running result is not ok 49116 1727204699.27900: done checking to see if all hosts have failed 49116 1727204699.27900: getting the remaining hosts for this loop 49116 1727204699.27902: done getting the remaining hosts for this loop 49116 1727204699.27906: getting the next task for host managed-node3 49116 1727204699.27917: done getting next task for host managed-node3 49116 1727204699.27919: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 49116 1727204699.27923: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204699.27928: getting variables 49116 1727204699.27930: in VariableManager get_vars() 49116 1727204699.28089: Calling all_inventory to load vars for managed-node3 49116 1727204699.28092: Calling groups_inventory to load vars for managed-node3 49116 1727204699.28095: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204699.28110: Calling all_plugins_play to load vars for managed-node3 49116 1727204699.28113: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204699.28117: Calling groups_plugins_play to load vars for managed-node3 49116 1727204699.30455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204699.33105: done with get_vars() 49116 1727204699.33144: done getting variables 49116 1727204699.33227: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204699.33378: variable 'interface' from source: include params 49116 1727204699.33386: variable 'vlan_interface' from source: play vars 49116 1727204699.33454: variable 'vlan_interface' from source: play vars TASK [Assert that the interface is present - 'lsr101.90'] ********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:04:59 -0400 (0:00:00.450) 0:00:22.359 ***** 49116 1727204699.33496: entering _queue_task() for managed-node3/assert 49116 1727204699.34079: worker is 1 (out of 1 available) 49116 1727204699.34091: exiting _queue_task() for managed-node3/assert 49116 1727204699.34103: done queuing things up, now waiting for results queue to drain 49116 1727204699.34104: waiting for pending results... 
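The assertion queued above lives at assert_device_present.yml:5 and, per the conditional evaluated in the next entries, checks interface_stat.stat.exists. A sketch consistent with that check (the real file may add a failure message or further conditions not visible in this excerpt):

- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists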
49116 1727204699.34295: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'lsr101.90' 49116 1727204699.34447: in run() - task 127b8e07-fff9-02f7-957b-000000000579 49116 1727204699.34472: variable 'ansible_search_path' from source: unknown 49116 1727204699.34510: variable 'ansible_search_path' from source: unknown 49116 1727204699.34546: calling self._execute() 49116 1727204699.34676: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.34687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.34767: variable 'omit' from source: magic vars 49116 1727204699.35084: variable 'ansible_distribution_major_version' from source: facts 49116 1727204699.35098: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204699.35104: variable 'omit' from source: magic vars 49116 1727204699.35151: variable 'omit' from source: magic vars 49116 1727204699.35343: variable 'interface' from source: include params 49116 1727204699.35347: variable 'vlan_interface' from source: play vars 49116 1727204699.35351: variable 'vlan_interface' from source: play vars 49116 1727204699.35354: variable 'omit' from source: magic vars 49116 1727204699.35384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204699.35426: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204699.35453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204699.35468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204699.35480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204699.35511: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204699.35514: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.35517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.35623: Set connection var ansible_connection to ssh 49116 1727204699.35638: Set connection var ansible_timeout to 10 49116 1727204699.35644: Set connection var ansible_shell_executable to /bin/sh 49116 1727204699.35650: Set connection var ansible_pipelining to False 49116 1727204699.35652: Set connection var ansible_shell_type to sh 49116 1727204699.35666: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204699.35849: variable 'ansible_shell_executable' from source: unknown 49116 1727204699.35852: variable 'ansible_connection' from source: unknown 49116 1727204699.35855: variable 'ansible_module_compression' from source: unknown 49116 1727204699.35857: variable 'ansible_shell_type' from source: unknown 49116 1727204699.35859: variable 'ansible_shell_executable' from source: unknown 49116 1727204699.35861: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.35864: variable 'ansible_pipelining' from source: unknown 49116 1727204699.35927: variable 'ansible_timeout' from source: unknown 49116 1727204699.35930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.35936: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204699.35939: variable 'omit' from source: magic vars 49116 1727204699.35947: starting attempt loop 49116 1727204699.35950: running the handler 49116 1727204699.36058: variable 'interface_stat' from source: set_fact 49116 1727204699.36064: Evaluated conditional (interface_stat.stat.exists): True 49116 1727204699.36071: handler run complete 49116 1727204699.36075: attempt loop complete, returning result 49116 1727204699.36078: _execute() done 49116 1727204699.36081: dumping result to json 49116 1727204699.36083: done dumping result, returning 49116 1727204699.36086: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'lsr101.90' [127b8e07-fff9-02f7-957b-000000000579] 49116 1727204699.36088: sending task result for task 127b8e07-fff9-02f7-957b-000000000579 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 49116 1727204699.36338: no more pending results, returning what we have 49116 1727204699.36341: results queue empty 49116 1727204699.36342: checking for any_errors_fatal 49116 1727204699.36355: done checking for any_errors_fatal 49116 1727204699.36356: checking for max_fail_percentage 49116 1727204699.36358: done checking for max_fail_percentage 49116 1727204699.36358: checking to see if all hosts have failed and the running result is not ok 49116 1727204699.36359: done checking to see if all hosts have failed 49116 1727204699.36360: getting the remaining hosts for this loop 49116 1727204699.36361: done getting the remaining hosts for this loop 49116 1727204699.36367: getting the next task for host managed-node3 49116 1727204699.36374: done getting next task for host managed-node3 49116 1727204699.36380: ^ task is: TASK: Include the task 'assert_profile_present.yml' 49116 1727204699.36382: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204699.36386: getting variables 49116 1727204699.36387: in VariableManager get_vars() 49116 1727204699.36427: Calling all_inventory to load vars for managed-node3 49116 1727204699.36430: Calling groups_inventory to load vars for managed-node3 49116 1727204699.36432: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204699.36445: Calling all_plugins_play to load vars for managed-node3 49116 1727204699.36448: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204699.36452: Calling groups_plugins_play to load vars for managed-node3 49116 1727204699.36464: done sending task result for task 127b8e07-fff9-02f7-957b-000000000579 49116 1727204699.36470: WORKER PROCESS EXITING 49116 1727204699.38151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204699.40356: done with get_vars() 49116 1727204699.40397: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:50 Tuesday 24 September 2024 15:04:59 -0400 (0:00:00.070) 0:00:22.430 ***** 49116 1727204699.40505: entering _queue_task() for managed-node3/include_tasks 49116 1727204699.40895: worker is 1 (out of 1 available) 49116 1727204699.40909: exiting _queue_task() for managed-node3/include_tasks 49116 1727204699.40922: done queuing things up, now waiting for results queue to drain 49116 1727204699.40923: waiting for pending results... 49116 1727204699.41252: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' 49116 1727204699.41373: in run() - task 127b8e07-fff9-02f7-957b-00000000005c 49116 1727204699.41381: variable 'ansible_search_path' from source: unknown 49116 1727204699.41410: variable 'interface' from source: play vars 49116 1727204699.41754: variable 'interface' from source: play vars 49116 1727204699.41758: variable 'vlan_interface' from source: play vars 49116 1727204699.41760: variable 'vlan_interface' from source: play vars 49116 1727204699.41764: variable 'omit' from source: magic vars 49116 1727204699.42073: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.42078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.42081: variable 'omit' from source: magic vars 49116 1727204699.42206: variable 'ansible_distribution_major_version' from source: facts 49116 1727204699.42215: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204699.42249: variable 'item' from source: unknown 49116 1727204699.42323: variable 'item' from source: unknown 49116 1727204699.42610: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.42639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.42642: variable 'omit' from source: magic vars 49116 1727204699.42646: variable 'ansible_distribution_major_version' from source: facts 49116 1727204699.42650: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204699.42725: variable 'item' from source: unknown 49116 1727204699.42730: variable 'item' from source: unknown 49116 1727204699.42803: dumping result to json 49116 1727204699.42806: done dumping result, returning 49116 1727204699.42809: done running TaskExecutor() for managed-node3/TASK: Include the task 
'assert_profile_present.yml' [127b8e07-fff9-02f7-957b-00000000005c] 49116 1727204699.42810: sending task result for task 127b8e07-fff9-02f7-957b-00000000005c 49116 1727204699.42970: done sending task result for task 127b8e07-fff9-02f7-957b-00000000005c 49116 1727204699.42974: WORKER PROCESS EXITING 49116 1727204699.43008: no more pending results, returning what we have 49116 1727204699.43013: in VariableManager get_vars() 49116 1727204699.43067: Calling all_inventory to load vars for managed-node3 49116 1727204699.43071: Calling groups_inventory to load vars for managed-node3 49116 1727204699.43073: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204699.43089: Calling all_plugins_play to load vars for managed-node3 49116 1727204699.43093: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204699.43096: Calling groups_plugins_play to load vars for managed-node3 49116 1727204699.45107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204699.47284: done with get_vars() 49116 1727204699.47319: variable 'ansible_search_path' from source: unknown 49116 1727204699.47339: variable 'ansible_search_path' from source: unknown 49116 1727204699.47347: we have included files to process 49116 1727204699.47348: generating all_blocks data 49116 1727204699.47350: done generating all_blocks data 49116 1727204699.47353: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49116 1727204699.47354: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49116 1727204699.47357: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49116 1727204699.47576: in VariableManager get_vars() 49116 1727204699.47603: done with get_vars() 49116 1727204699.47877: done processing included file 49116 1727204699.47880: iterating over new_blocks loaded from include file 49116 1727204699.47881: in VariableManager get_vars() 49116 1727204699.47902: done with get_vars() 49116 1727204699.47903: filtering new block on tags 49116 1727204699.47927: done filtering new block on tags 49116 1727204699.47929: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=lsr101) 49116 1727204699.47935: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49116 1727204699.47936: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49116 1727204699.47939: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49116 1727204699.48049: in VariableManager get_vars() 49116 1727204699.48075: done with get_vars() 49116 1727204699.48310: done processing included file 49116 1727204699.48312: iterating over new_blocks loaded from include file 49116 1727204699.48313: in VariableManager get_vars() 49116 1727204699.48332: done with get_vars() 49116 1727204699.48334: filtering new block on tags 49116 1727204699.48354: done filtering new block on tags 
49116 1727204699.48356: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=lsr101.90) 49116 1727204699.48361: extending task lists for all hosts with included blocks 49116 1727204699.51275: done extending task lists 49116 1727204699.51278: done processing included files 49116 1727204699.51278: results queue empty 49116 1727204699.51279: checking for any_errors_fatal 49116 1727204699.51284: done checking for any_errors_fatal 49116 1727204699.51285: checking for max_fail_percentage 49116 1727204699.51287: done checking for max_fail_percentage 49116 1727204699.51288: checking to see if all hosts have failed and the running result is not ok 49116 1727204699.51288: done checking to see if all hosts have failed 49116 1727204699.51289: getting the remaining hosts for this loop 49116 1727204699.51291: done getting the remaining hosts for this loop 49116 1727204699.51293: getting the next task for host managed-node3 49116 1727204699.51298: done getting next task for host managed-node3 49116 1727204699.51301: ^ task is: TASK: Include the task 'get_profile_stat.yml' 49116 1727204699.51303: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204699.51306: getting variables 49116 1727204699.51307: in VariableManager get_vars() 49116 1727204699.51328: Calling all_inventory to load vars for managed-node3 49116 1727204699.51331: Calling groups_inventory to load vars for managed-node3 49116 1727204699.51333: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204699.51341: Calling all_plugins_play to load vars for managed-node3 49116 1727204699.51344: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204699.51354: Calling groups_plugins_play to load vars for managed-node3 49116 1727204699.52975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204699.55177: done with get_vars() 49116 1727204699.55213: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:04:59 -0400 (0:00:00.147) 0:00:22.578 ***** 49116 1727204699.55305: entering _queue_task() for managed-node3/include_tasks 49116 1727204699.55704: worker is 1 (out of 1 available) 49116 1727204699.55718: exiting _queue_task() for managed-node3/include_tasks 49116 1727204699.55732: done queuing things up, now waiting for results queue to drain 49116 1727204699.55733: waiting for pending results... 
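The two inclusions above (item=lsr101 and item=lsr101.90) were generated by the include task at tests_vlan_mtu.yml:50, which loops assert_profile_present.yml over the base interface and the VLAN interface. The sketch below is consistent with the loop items and with the 'profile' include parameter that shows up later in the trace; the exact wording of the real task is not reproduced in this log:

- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"        # assumption: matches the 'profile' include param seen further on
  loop:
    - "{{ interface }}"          # lsr101
    - "{{ vlan_interface }}"     # lsr101.90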
49116 1727204699.56030: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 49116 1727204699.56174: in run() - task 127b8e07-fff9-02f7-957b-0000000006b8 49116 1727204699.56179: variable 'ansible_search_path' from source: unknown 49116 1727204699.56182: variable 'ansible_search_path' from source: unknown 49116 1727204699.56199: calling self._execute() 49116 1727204699.56372: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.56377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.56380: variable 'omit' from source: magic vars 49116 1727204699.56745: variable 'ansible_distribution_major_version' from source: facts 49116 1727204699.56758: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204699.56766: _execute() done 49116 1727204699.56770: dumping result to json 49116 1727204699.56773: done dumping result, returning 49116 1727204699.56779: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-02f7-957b-0000000006b8] 49116 1727204699.56786: sending task result for task 127b8e07-fff9-02f7-957b-0000000006b8 49116 1727204699.56921: done sending task result for task 127b8e07-fff9-02f7-957b-0000000006b8 49116 1727204699.56926: WORKER PROCESS EXITING 49116 1727204699.56960: no more pending results, returning what we have 49116 1727204699.56967: in VariableManager get_vars() 49116 1727204699.57020: Calling all_inventory to load vars for managed-node3 49116 1727204699.57023: Calling groups_inventory to load vars for managed-node3 49116 1727204699.57025: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204699.57041: Calling all_plugins_play to load vars for managed-node3 49116 1727204699.57044: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204699.57046: Calling groups_plugins_play to load vars for managed-node3 49116 1727204699.59030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204699.61292: done with get_vars() 49116 1727204699.61325: variable 'ansible_search_path' from source: unknown 49116 1727204699.61326: variable 'ansible_search_path' from source: unknown 49116 1727204699.61374: we have included files to process 49116 1727204699.61376: generating all_blocks data 49116 1727204699.61377: done generating all_blocks data 49116 1727204699.61379: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49116 1727204699.61380: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49116 1727204699.61382: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49116 1727204699.62521: done processing included file 49116 1727204699.62524: iterating over new_blocks loaded from include file 49116 1727204699.62525: in VariableManager get_vars() 49116 1727204699.62550: done with get_vars() 49116 1727204699.62552: filtering new block on tags 49116 1727204699.62580: done filtering new block on tags 49116 1727204699.62584: in VariableManager get_vars() 49116 1727204699.62604: done with get_vars() 49116 1727204699.62606: filtering new block on tags 49116 1727204699.62630: done filtering new block on tags 49116 1727204699.62632: done iterating over 
new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 49116 1727204699.62637: extending task lists for all hosts with included blocks 49116 1727204699.62824: done extending task lists 49116 1727204699.62826: done processing included files 49116 1727204699.62827: results queue empty 49116 1727204699.62827: checking for any_errors_fatal 49116 1727204699.62831: done checking for any_errors_fatal 49116 1727204699.62832: checking for max_fail_percentage 49116 1727204699.62833: done checking for max_fail_percentage 49116 1727204699.62834: checking to see if all hosts have failed and the running result is not ok 49116 1727204699.62835: done checking to see if all hosts have failed 49116 1727204699.62836: getting the remaining hosts for this loop 49116 1727204699.62837: done getting the remaining hosts for this loop 49116 1727204699.62840: getting the next task for host managed-node3 49116 1727204699.62844: done getting next task for host managed-node3 49116 1727204699.62846: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 49116 1727204699.62850: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204699.62852: getting variables 49116 1727204699.62853: in VariableManager get_vars() 49116 1727204699.62948: Calling all_inventory to load vars for managed-node3 49116 1727204699.62951: Calling groups_inventory to load vars for managed-node3 49116 1727204699.62953: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204699.62961: Calling all_plugins_play to load vars for managed-node3 49116 1727204699.62963: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204699.62969: Calling groups_plugins_play to load vars for managed-node3 49116 1727204699.64402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204699.66523: done with get_vars() 49116 1727204699.66562: done getting variables 49116 1727204699.66610: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:04:59 -0400 (0:00:00.113) 0:00:22.691 ***** 49116 1727204699.66642: entering _queue_task() for managed-node3/set_fact 49116 1727204699.67034: worker is 1 (out of 1 available) 49116 1727204699.67049: exiting _queue_task() for managed-node3/set_fact 49116 1727204699.67064: done queuing things up, now waiting for results queue to drain 49116 1727204699.67067: waiting for pending results... 
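The set_fact task queued above (get_profile_stat.yml:3) initializes three per-profile flags to false; the task result a little further on shows exactly those values, so the task is effectively:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

Later tasks in get_profile_stat.yml (such as the 'Stat profile file' task at line 9, queued next) are presumably what flip these flags once the profile is found.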
49116 1727204699.67389: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 49116 1727204699.67555: in run() - task 127b8e07-fff9-02f7-957b-0000000007f0 49116 1727204699.67562: variable 'ansible_search_path' from source: unknown 49116 1727204699.67565: variable 'ansible_search_path' from source: unknown 49116 1727204699.67570: calling self._execute() 49116 1727204699.67670: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.67683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.67701: variable 'omit' from source: magic vars 49116 1727204699.68138: variable 'ansible_distribution_major_version' from source: facts 49116 1727204699.68158: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204699.68174: variable 'omit' from source: magic vars 49116 1727204699.68239: variable 'omit' from source: magic vars 49116 1727204699.68286: variable 'omit' from source: magic vars 49116 1727204699.68427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204699.68431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204699.68437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204699.68452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204699.68474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204699.68516: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204699.68527: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.68544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.68662: Set connection var ansible_connection to ssh 49116 1727204699.68685: Set connection var ansible_timeout to 10 49116 1727204699.68700: Set connection var ansible_shell_executable to /bin/sh 49116 1727204699.68710: Set connection var ansible_pipelining to False 49116 1727204699.68717: Set connection var ansible_shell_type to sh 49116 1727204699.68727: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204699.68764: variable 'ansible_shell_executable' from source: unknown 49116 1727204699.68861: variable 'ansible_connection' from source: unknown 49116 1727204699.68868: variable 'ansible_module_compression' from source: unknown 49116 1727204699.68871: variable 'ansible_shell_type' from source: unknown 49116 1727204699.68874: variable 'ansible_shell_executable' from source: unknown 49116 1727204699.68876: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.68878: variable 'ansible_pipelining' from source: unknown 49116 1727204699.68880: variable 'ansible_timeout' from source: unknown 49116 1727204699.68882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.68984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204699.69003: variable 
'omit' from source: magic vars 49116 1727204699.69013: starting attempt loop 49116 1727204699.69020: running the handler 49116 1727204699.69040: handler run complete 49116 1727204699.69055: attempt loop complete, returning result 49116 1727204699.69061: _execute() done 49116 1727204699.69071: dumping result to json 49116 1727204699.69083: done dumping result, returning 49116 1727204699.69094: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-02f7-957b-0000000007f0] 49116 1727204699.69104: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f0 49116 1727204699.69301: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f0 49116 1727204699.69305: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 49116 1727204699.69370: no more pending results, returning what we have 49116 1727204699.69373: results queue empty 49116 1727204699.69374: checking for any_errors_fatal 49116 1727204699.69376: done checking for any_errors_fatal 49116 1727204699.69377: checking for max_fail_percentage 49116 1727204699.69379: done checking for max_fail_percentage 49116 1727204699.69380: checking to see if all hosts have failed and the running result is not ok 49116 1727204699.69381: done checking to see if all hosts have failed 49116 1727204699.69382: getting the remaining hosts for this loop 49116 1727204699.69383: done getting the remaining hosts for this loop 49116 1727204699.69390: getting the next task for host managed-node3 49116 1727204699.69399: done getting next task for host managed-node3 49116 1727204699.69402: ^ task is: TASK: Stat profile file 49116 1727204699.69407: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204699.69413: getting variables 49116 1727204699.69414: in VariableManager get_vars() 49116 1727204699.69460: Calling all_inventory to load vars for managed-node3 49116 1727204699.69463: Calling groups_inventory to load vars for managed-node3 49116 1727204699.69602: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204699.69616: Calling all_plugins_play to load vars for managed-node3 49116 1727204699.69619: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204699.69627: Calling groups_plugins_play to load vars for managed-node3 49116 1727204699.71498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204699.73843: done with get_vars() 49116 1727204699.73890: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:04:59 -0400 (0:00:00.073) 0:00:22.765 ***** 49116 1727204699.74011: entering _queue_task() for managed-node3/stat 49116 1727204699.74512: worker is 1 (out of 1 available) 49116 1727204699.74528: exiting _queue_task() for managed-node3/stat 49116 1727204699.74544: done queuing things up, now waiting for results queue to drain 49116 1727204699.74546: waiting for pending results... 49116 1727204699.74887: running TaskExecutor() for managed-node3/TASK: Stat profile file 49116 1727204699.75008: in run() - task 127b8e07-fff9-02f7-957b-0000000007f1 49116 1727204699.75012: variable 'ansible_search_path' from source: unknown 49116 1727204699.75016: variable 'ansible_search_path' from source: unknown 49116 1727204699.75031: calling self._execute() 49116 1727204699.75156: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.75171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.75187: variable 'omit' from source: magic vars 49116 1727204699.75627: variable 'ansible_distribution_major_version' from source: facts 49116 1727204699.75654: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204699.75764: variable 'omit' from source: magic vars 49116 1727204699.75768: variable 'omit' from source: magic vars 49116 1727204699.75854: variable 'profile' from source: include params 49116 1727204699.75864: variable 'item' from source: include params 49116 1727204699.75948: variable 'item' from source: include params 49116 1727204699.75976: variable 'omit' from source: magic vars 49116 1727204699.76037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204699.76085: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204699.76120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204699.76148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204699.76168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204699.76318: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204699.76322: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.76325: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.76354: Set connection var ansible_connection to ssh 49116 1727204699.76377: Set connection var ansible_timeout to 10 49116 1727204699.76392: Set connection var ansible_shell_executable to /bin/sh 49116 1727204699.76403: Set connection var ansible_pipelining to False 49116 1727204699.76410: Set connection var ansible_shell_type to sh 49116 1727204699.76426: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204699.76464: variable 'ansible_shell_executable' from source: unknown 49116 1727204699.76475: variable 'ansible_connection' from source: unknown 49116 1727204699.76483: variable 'ansible_module_compression' from source: unknown 49116 1727204699.76490: variable 'ansible_shell_type' from source: unknown 49116 1727204699.76497: variable 'ansible_shell_executable' from source: unknown 49116 1727204699.76503: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204699.76511: variable 'ansible_pipelining' from source: unknown 49116 1727204699.76518: variable 'ansible_timeout' from source: unknown 49116 1727204699.76527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204699.76784: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204699.76861: variable 'omit' from source: magic vars 49116 1727204699.76864: starting attempt loop 49116 1727204699.76869: running the handler 49116 1727204699.76872: _low_level_execute_command(): starting 49116 1727204699.76874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204699.77782: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204699.77824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204699.77848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204699.77908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204699.78003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204699.79873: stdout chunk (state=3): >>>/root <<< 49116 1727204699.80096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204699.80100: stdout chunk (state=3): >>><<< 49116 1727204699.80103: stderr chunk (state=3): >>><<< 49116 1727204699.80130: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204699.80254: _low_level_execute_command(): starting 49116 1727204699.80260: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260 `" && echo ansible-tmp-1727204699.8013713-50605-36442079479260="` echo /root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260 `" ) && sleep 0' 49116 1727204699.81052: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204699.81194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204699.81199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204699.81325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204699.81328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204699.81382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204699.81546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204699.83681: stdout chunk (state=3): >>>ansible-tmp-1727204699.8013713-50605-36442079479260=/root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260 <<< 49116 1727204699.83895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204699.83899: stdout chunk (state=3): >>><<< 49116 1727204699.83902: stderr chunk 
(state=3): >>><<< 49116 1727204699.84072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204699.8013713-50605-36442079479260=/root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204699.84076: variable 'ansible_module_compression' from source: unknown 49116 1727204699.84078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 49116 1727204699.84103: variable 'ansible_facts' from source: unknown 49116 1727204699.84198: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/AnsiballZ_stat.py 49116 1727204699.84475: Sending initial data 49116 1727204699.84479: Sent initial data (152 bytes) 49116 1727204699.86115: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204699.86120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204699.86200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204699.86336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204699.88177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204699.88231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204699.88306: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpviss85hh /root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/AnsiballZ_stat.py <<< 49116 1727204699.88309: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/AnsiballZ_stat.py" <<< 49116 1727204699.88408: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpviss85hh" to remote "/root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/AnsiballZ_stat.py" <<< 49116 1727204699.89828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204699.90055: stderr chunk (state=3): >>><<< 49116 1727204699.90059: stdout chunk (state=3): >>><<< 49116 1727204699.90090: done transferring module to remote 49116 1727204699.90101: _low_level_execute_command(): starting 49116 1727204699.90106: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/ /root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/AnsiballZ_stat.py && sleep 0' 49116 1727204699.91184: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204699.91238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204699.91289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204699.91293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204699.91296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204699.91298: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204699.91300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204699.91303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204699.91392: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 
setting O_NONBLOCK <<< 49116 1727204699.91399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204699.91504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204699.93774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204699.93819: stderr chunk (state=3): >>><<< 49116 1727204699.94047: stdout chunk (state=3): >>><<< 49116 1727204699.94052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204699.94055: _low_level_execute_command(): starting 49116 1727204699.94057: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/AnsiballZ_stat.py && sleep 0' 49116 1727204699.95092: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204699.95216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204699.95239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204699.95262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204699.95290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204699.95306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204699.95438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204700.20345: stdout chunk (state=3): >>> {"changed": false, "stat": 
{"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 49116 1727204700.22060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204700.22065: stdout chunk (state=3): >>><<< 49116 1727204700.22072: stderr chunk (state=3): >>><<< 49116 1727204700.22273: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
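The stat invocation above (module_args: path /etc/sysconfig/network-scripts/ifcfg-lsr101, get_attributes/get_checksum/get_mime all false) corresponds to the "Stat profile file" task at get_profile_stat.yml:9. A plausible shape for that task, reconstructed from those module_args, is sketched below; templating the path from the profile variable is an assumption (the trace only shows the resolved filename), and the register name is inferred from the profile_stat variable consulted by the following task.

# Plausible sketch of get_profile_stat.yml:9, based on the module_args above.
# The {{ profile }} templating is assumed; this run resolves it to lsr101.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat  # later read as profile_stat.stat.exists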
49116 1727204700.22278: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204700.22282: _low_level_execute_command(): starting 49116 1727204700.22285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204699.8013713-50605-36442079479260/ > /dev/null 2>&1 && sleep 0' 49116 1727204700.22841: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204700.23072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204700.23078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204700.23081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204700.23084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204700.23086: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204700.23088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204700.23091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204700.23093: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204700.23095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49116 1727204700.23098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204700.23101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204700.23691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204700.25732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204700.25740: stdout chunk (state=3): >>><<< 49116 1727204700.25742: stderr chunk (state=3): >>><<< 49116 1727204700.25768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204700.25776: handler run complete 49116 1727204700.25801: attempt loop complete, returning result 49116 1727204700.25805: _execute() done 49116 1727204700.25807: dumping result to json 49116 1727204700.25810: done dumping result, returning 49116 1727204700.25820: done running TaskExecutor() for managed-node3/TASK: Stat profile file [127b8e07-fff9-02f7-957b-0000000007f1] 49116 1727204700.25825: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f1 ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 49116 1727204700.26035: no more pending results, returning what we have 49116 1727204700.26039: results queue empty 49116 1727204700.26040: checking for any_errors_fatal 49116 1727204700.26046: done checking for any_errors_fatal 49116 1727204700.26047: checking for max_fail_percentage 49116 1727204700.26049: done checking for max_fail_percentage 49116 1727204700.26050: checking to see if all hosts have failed and the running result is not ok 49116 1727204700.26051: done checking to see if all hosts have failed 49116 1727204700.26052: getting the remaining hosts for this loop 49116 1727204700.26053: done getting the remaining hosts for this loop 49116 1727204700.26058: getting the next task for host managed-node3 49116 1727204700.26065: done getting next task for host managed-node3 49116 1727204700.26071: ^ task is: TASK: Set NM profile exist flag based on the profile files 49116 1727204700.26077: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204700.26083: getting variables 49116 1727204700.26085: in VariableManager get_vars() 49116 1727204700.26139: Calling all_inventory to load vars for managed-node3 49116 1727204700.26142: Calling groups_inventory to load vars for managed-node3 49116 1727204700.26145: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204700.26160: Calling all_plugins_play to load vars for managed-node3 49116 1727204700.26164: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204700.26274: Calling groups_plugins_play to load vars for managed-node3 49116 1727204700.27089: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f1 49116 1727204700.27093: WORKER PROCESS EXITING 49116 1727204700.29196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204700.31376: done with get_vars() 49116 1727204700.31417: done getting variables 49116 1727204700.31494: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.575) 0:00:23.340 ***** 49116 1727204700.31532: entering _queue_task() for managed-node3/set_fact 49116 1727204700.31952: worker is 1 (out of 1 available) 49116 1727204700.31968: exiting _queue_task() for managed-node3/set_fact 49116 1727204700.31983: done queuing things up, now waiting for results queue to drain 49116 1727204700.31985: waiting for pending results... 
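The task queued next, at get_profile_stat.yml:17, ends up skipped in the result below because profile_stat.stat.exists evaluates to False. A minimal sketch consistent with that skip reason (the when: condition is taken from the false_condition in the skipping: result, the fact name from the initialization task earlier in this trace, and the value it would set is assumed) looks like:

# Hypothetical sketch of get_profile_stat.yml:17; the value set when the
# condition holds is an assumption, only the condition is shown in the trace.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists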
49116 1727204700.32313: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 49116 1727204700.32464: in run() - task 127b8e07-fff9-02f7-957b-0000000007f2 49116 1727204700.32493: variable 'ansible_search_path' from source: unknown 49116 1727204700.32502: variable 'ansible_search_path' from source: unknown 49116 1727204700.32562: calling self._execute() 49116 1727204700.32693: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204700.32708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204700.32725: variable 'omit' from source: magic vars 49116 1727204700.33164: variable 'ansible_distribution_major_version' from source: facts 49116 1727204700.33186: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204700.33338: variable 'profile_stat' from source: set_fact 49116 1727204700.33360: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204700.33371: when evaluation is False, skipping this task 49116 1727204700.33379: _execute() done 49116 1727204700.33388: dumping result to json 49116 1727204700.33396: done dumping result, returning 49116 1727204700.33408: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-02f7-957b-0000000007f2] 49116 1727204700.33425: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f2 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204700.33599: no more pending results, returning what we have 49116 1727204700.33604: results queue empty 49116 1727204700.33606: checking for any_errors_fatal 49116 1727204700.33616: done checking for any_errors_fatal 49116 1727204700.33617: checking for max_fail_percentage 49116 1727204700.33619: done checking for max_fail_percentage 49116 1727204700.33621: checking to see if all hosts have failed and the running result is not ok 49116 1727204700.33621: done checking to see if all hosts have failed 49116 1727204700.33622: getting the remaining hosts for this loop 49116 1727204700.33624: done getting the remaining hosts for this loop 49116 1727204700.33629: getting the next task for host managed-node3 49116 1727204700.33640: done getting next task for host managed-node3 49116 1727204700.33643: ^ task is: TASK: Get NM profile info 49116 1727204700.33648: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204700.33654: getting variables 49116 1727204700.33656: in VariableManager get_vars() 49116 1727204700.33705: Calling all_inventory to load vars for managed-node3 49116 1727204700.33708: Calling groups_inventory to load vars for managed-node3 49116 1727204700.33710: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204700.33725: Calling all_plugins_play to load vars for managed-node3 49116 1727204700.33728: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204700.33732: Calling groups_plugins_play to load vars for managed-node3 49116 1727204700.34585: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f2 49116 1727204700.34590: WORKER PROCESS EXITING 49116 1727204700.35809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204700.37968: done with get_vars() 49116 1727204700.38006: done getting variables 49116 1727204700.38120: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.066) 0:00:23.406 ***** 49116 1727204700.38159: entering _queue_task() for managed-node3/shell 49116 1727204700.38161: Creating lock for shell 49116 1727204700.38569: worker is 1 (out of 1 available) 49116 1727204700.38588: exiting _queue_task() for managed-node3/shell 49116 1727204700.38602: done queuing things up, now waiting for results queue to drain 49116 1727204700.38603: waiting for pending results... 
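The shell task queued here, at get_profile_stat.yml:25, runs the nmcli pipeline whose resolved command string appears in the module result further down ("nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc"). A reconstruction of the task could be as follows; templating the grep pattern from the profile variable and the register name are assumptions, since only the resolved command is visible in the trace.

# Hypothetical reconstruction of get_profile_stat.yml:25. The command string is
# copied from the _raw_params in the module result below; the variable templating
# and the register name are placeholders.
- name: Get NM profile info
  shell: "nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc"
  register: nm_profile_info  # placeholder; the actual register name is not shown in this part of the trace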
49116 1727204700.38880: running TaskExecutor() for managed-node3/TASK: Get NM profile info 49116 1727204700.39019: in run() - task 127b8e07-fff9-02f7-957b-0000000007f3 49116 1727204700.39042: variable 'ansible_search_path' from source: unknown 49116 1727204700.39051: variable 'ansible_search_path' from source: unknown 49116 1727204700.39095: calling self._execute() 49116 1727204700.39205: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204700.39222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204700.39238: variable 'omit' from source: magic vars 49116 1727204700.39644: variable 'ansible_distribution_major_version' from source: facts 49116 1727204700.39765: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204700.39770: variable 'omit' from source: magic vars 49116 1727204700.39773: variable 'omit' from source: magic vars 49116 1727204700.39854: variable 'profile' from source: include params 49116 1727204700.39864: variable 'item' from source: include params 49116 1727204700.39940: variable 'item' from source: include params 49116 1727204700.39966: variable 'omit' from source: magic vars 49116 1727204700.40024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204700.40071: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204700.40170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204700.40174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204700.40177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204700.40179: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204700.40181: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204700.40185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204700.40296: Set connection var ansible_connection to ssh 49116 1727204700.40320: Set connection var ansible_timeout to 10 49116 1727204700.40333: Set connection var ansible_shell_executable to /bin/sh 49116 1727204700.40343: Set connection var ansible_pipelining to False 49116 1727204700.40349: Set connection var ansible_shell_type to sh 49116 1727204700.40358: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204700.40387: variable 'ansible_shell_executable' from source: unknown 49116 1727204700.40395: variable 'ansible_connection' from source: unknown 49116 1727204700.40402: variable 'ansible_module_compression' from source: unknown 49116 1727204700.40408: variable 'ansible_shell_type' from source: unknown 49116 1727204700.40423: variable 'ansible_shell_executable' from source: unknown 49116 1727204700.40471: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204700.40474: variable 'ansible_pipelining' from source: unknown 49116 1727204700.40477: variable 'ansible_timeout' from source: unknown 49116 1727204700.40479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204700.40609: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204700.40626: variable 'omit' from source: magic vars 49116 1727204700.40642: starting attempt loop 49116 1727204700.40649: running the handler 49116 1727204700.40663: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204700.40690: _low_level_execute_command(): starting 49116 1727204700.40743: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204700.41558: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204700.41638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204700.41704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204700.41734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204700.41758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204700.41862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204700.43727: stdout chunk (state=3): >>>/root <<< 49116 1727204700.43892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204700.43944: stderr chunk (state=3): >>><<< 49116 1727204700.43953: stdout chunk (state=3): >>><<< 49116 1727204700.43985: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204700.44012: _low_level_execute_command(): starting 49116 1727204700.44023: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990 `" && echo ansible-tmp-1727204700.4399865-50632-279107830279990="` echo /root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990 `" ) && sleep 0' 49116 1727204700.44785: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204700.44915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204700.44951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204700.44974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204700.45015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204700.45160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204700.47360: stdout chunk (state=3): >>>ansible-tmp-1727204700.4399865-50632-279107830279990=/root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990 <<< 49116 1727204700.47570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204700.47584: stderr chunk (state=3): >>><<< 49116 1727204700.47593: stdout chunk (state=3): >>><<< 49116 1727204700.47619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204700.4399865-50632-279107830279990=/root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204700.47675: variable 'ansible_module_compression' from source: unknown 49116 1727204700.47770: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204700.47774: variable 'ansible_facts' from source: unknown 49116 1727204700.47864: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/AnsiballZ_command.py 49116 1727204700.48020: Sending initial data 49116 1727204700.48081: Sent initial data (156 bytes) 49116 1727204700.48723: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204700.48738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204700.48754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204700.48773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204700.48829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204700.48893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204700.48910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204700.48941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204700.49055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204700.50876: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204700.50996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204700.51099: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpcioqck1h /root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/AnsiballZ_command.py <<< 49116 1727204700.51115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/AnsiballZ_command.py" <<< 49116 1727204700.51183: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpcioqck1h" to remote "/root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/AnsiballZ_command.py" <<< 49116 1727204700.52239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204700.52365: stderr chunk (state=3): >>><<< 49116 1727204700.52370: stdout chunk (state=3): >>><<< 49116 1727204700.52373: done transferring module to remote 49116 1727204700.52375: _low_level_execute_command(): starting 49116 1727204700.52378: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/ /root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/AnsiballZ_command.py && sleep 0' 49116 1727204700.53159: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204700.53227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204700.53257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204700.53305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204700.53390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204700.55673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204700.55678: stdout chunk (state=3): >>><<< 49116 1727204700.55681: stderr chunk (state=3): >>><<< 49116 1727204700.55684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204700.55686: _low_level_execute_command(): starting 49116 1727204700.55689: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/AnsiballZ_command.py && sleep 0' 49116 1727204700.56298: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204700.56308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204700.56319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204700.56342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204700.56448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204700.56471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204700.56588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204700.76548: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-24 15:05:00.741942", "end": "2024-09-24 15:05:00.764072", "delta": "0:00:00.022130", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204700.78707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204700.78712: stdout chunk (state=3): >>><<< 49116 1727204700.78714: stderr chunk (state=3): >>><<< 49116 1727204700.78717: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-24 15:05:00.741942", "end": "2024-09-24 15:05:00.764072", "delta": "0:00:00.022130", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204700.78720: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204700.78722: _low_level_execute_command(): starting 49116 1727204700.78725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204700.4399865-50632-279107830279990/ > /dev/null 2>&1 && sleep 0' 49116 1727204700.79364: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204700.79372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204700.79387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204700.79401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204700.79413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204700.79420: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204700.79429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204700.79445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204700.79452: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204700.79460: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49116 1727204700.79470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204700.79480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204700.79494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204700.79501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204700.79508: stderr chunk (state=3): >>>debug2: match found <<< 49116 1727204700.79518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204700.79589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204700.79647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204700.79650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204700.79723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204700.81871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204700.81876: stdout chunk (state=3): >>><<< 49116 1727204700.81879: stderr chunk (state=3): >>><<< 
49116 1727204700.81900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204700.81915: handler run complete 49116 1727204700.81974: Evaluated conditional (False): False 49116 1727204700.81977: attempt loop complete, returning result 49116 1727204700.81980: _execute() done 49116 1727204700.81986: dumping result to json 49116 1727204700.81995: done dumping result, returning 49116 1727204700.82044: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [127b8e07-fff9-02f7-957b-0000000007f3] 49116 1727204700.82048: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f3 ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "delta": "0:00:00.022130", "end": "2024-09-24 15:05:00.764072", "rc": 0, "start": "2024-09-24 15:05:00.741942" } STDOUT: lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 49116 1727204700.82456: no more pending results, returning what we have 49116 1727204700.82461: results queue empty 49116 1727204700.82462: checking for any_errors_fatal 49116 1727204700.82475: done checking for any_errors_fatal 49116 1727204700.82481: checking for max_fail_percentage 49116 1727204700.82483: done checking for max_fail_percentage 49116 1727204700.82484: checking to see if all hosts have failed and the running result is not ok 49116 1727204700.82485: done checking to see if all hosts have failed 49116 1727204700.82486: getting the remaining hosts for this loop 49116 1727204700.82488: done getting the remaining hosts for this loop 49116 1727204700.82494: getting the next task for host managed-node3 49116 1727204700.82502: done getting next task for host managed-node3 49116 1727204700.82505: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 49116 1727204700.82511: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204700.82515: getting variables 49116 1727204700.82517: in VariableManager get_vars() 49116 1727204700.82574: Calling all_inventory to load vars for managed-node3 49116 1727204700.82578: Calling groups_inventory to load vars for managed-node3 49116 1727204700.82580: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204700.82707: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f3 49116 1727204700.82711: WORKER PROCESS EXITING 49116 1727204700.82723: Calling all_plugins_play to load vars for managed-node3 49116 1727204700.82727: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204700.82731: Calling groups_plugins_play to load vars for managed-node3 49116 1727204700.84694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204700.86966: done with get_vars() 49116 1727204700.87006: done getting variables 49116 1727204700.87083: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:05:00 -0400 (0:00:00.489) 0:00:23.896 ***** 49116 1727204700.87119: entering _queue_task() for managed-node3/set_fact 49116 1727204700.87523: worker is 1 (out of 1 available) 49116 1727204700.87538: exiting _queue_task() for managed-node3/set_fact 49116 1727204700.87552: done queuing things up, now waiting for results queue to drain 49116 1727204700.87553: waiting for pending results... 
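For orientation: the command that just ran and the set_fact that is about to run correspond to two tasks in get_profile_stat.yml. A rough reconstructed sketch of what they most likely look like follows; the nmcli command, the registered variable nm_profile_exists, the three lsr_net_profile_* fact names and the rc == 0 condition are taken from this log, while the use of {{ profile }} in the command, the changed_when handling and the exact task layout are assumptions.

---
# Reconstructed sketch (not the actual test file) of the two tasks seen here.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
  register: nm_profile_exists
  changed_when: false   # assumption, consistent with the "changed": false in the reported result

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0   # rc was 0 above, so all three flags end up true

With the keyfile paths returned by nmcli, the grep pipeline exits 0, so the set_fact below runs and sets all three flags to true.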
49116 1727204700.87888: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 49116 1727204700.88032: in run() - task 127b8e07-fff9-02f7-957b-0000000007f4 49116 1727204700.88057: variable 'ansible_search_path' from source: unknown 49116 1727204700.88067: variable 'ansible_search_path' from source: unknown 49116 1727204700.88112: calling self._execute() 49116 1727204700.88231: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204700.88250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204700.88269: variable 'omit' from source: magic vars 49116 1727204700.88712: variable 'ansible_distribution_major_version' from source: facts 49116 1727204700.88730: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204700.88863: variable 'nm_profile_exists' from source: set_fact 49116 1727204700.88887: Evaluated conditional (nm_profile_exists.rc == 0): True 49116 1727204700.88904: variable 'omit' from source: magic vars 49116 1727204700.88956: variable 'omit' from source: magic vars 49116 1727204700.88996: variable 'omit' from source: magic vars 49116 1727204700.89062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204700.89114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204700.89149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204700.89178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204700.89198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204700.89246: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204700.89255: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204700.89263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204700.89400: Set connection var ansible_connection to ssh 49116 1727204700.89421: Set connection var ansible_timeout to 10 49116 1727204700.89434: Set connection var ansible_shell_executable to /bin/sh 49116 1727204700.89452: Set connection var ansible_pipelining to False 49116 1727204700.89460: Set connection var ansible_shell_type to sh 49116 1727204700.89473: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204700.89505: variable 'ansible_shell_executable' from source: unknown 49116 1727204700.89514: variable 'ansible_connection' from source: unknown 49116 1727204700.89521: variable 'ansible_module_compression' from source: unknown 49116 1727204700.89528: variable 'ansible_shell_type' from source: unknown 49116 1727204700.89534: variable 'ansible_shell_executable' from source: unknown 49116 1727204700.89558: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204700.89561: variable 'ansible_pipelining' from source: unknown 49116 1727204700.89563: variable 'ansible_timeout' from source: unknown 49116 1727204700.89664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204700.89743: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204700.89764: variable 'omit' from source: magic vars 49116 1727204700.89785: starting attempt loop 49116 1727204700.89794: running the handler 49116 1727204700.89813: handler run complete 49116 1727204700.89828: attempt loop complete, returning result 49116 1727204700.89835: _execute() done 49116 1727204700.89842: dumping result to json 49116 1727204700.89850: done dumping result, returning 49116 1727204700.89863: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-02f7-957b-0000000007f4] 49116 1727204700.89875: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f4 49116 1727204700.90139: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f4 49116 1727204700.90142: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 49116 1727204700.90215: no more pending results, returning what we have 49116 1727204700.90219: results queue empty 49116 1727204700.90220: checking for any_errors_fatal 49116 1727204700.90229: done checking for any_errors_fatal 49116 1727204700.90230: checking for max_fail_percentage 49116 1727204700.90232: done checking for max_fail_percentage 49116 1727204700.90233: checking to see if all hosts have failed and the running result is not ok 49116 1727204700.90234: done checking to see if all hosts have failed 49116 1727204700.90235: getting the remaining hosts for this loop 49116 1727204700.90237: done getting the remaining hosts for this loop 49116 1727204700.90242: getting the next task for host managed-node3 49116 1727204700.90255: done getting next task for host managed-node3 49116 1727204700.90259: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 49116 1727204700.90264: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204700.90274: getting variables 49116 1727204700.90276: in VariableManager get_vars() 49116 1727204700.90326: Calling all_inventory to load vars for managed-node3 49116 1727204700.90330: Calling groups_inventory to load vars for managed-node3 49116 1727204700.90332: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204700.90347: Calling all_plugins_play to load vars for managed-node3 49116 1727204700.90350: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204700.90354: Calling groups_plugins_play to load vars for managed-node3 49116 1727204700.92534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.00177: done with get_vars() 49116 1727204701.00217: done getting variables 49116 1727204701.00277: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204701.00389: variable 'profile' from source: include params 49116 1727204701.00392: variable 'item' from source: include params 49116 1727204701.00461: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101] ************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.133) 0:00:24.029 ***** 49116 1727204701.00498: entering _queue_task() for managed-node3/command 49116 1727204701.00920: worker is 1 (out of 1 available) 49116 1727204701.00935: exiting _queue_task() for managed-node3/command 49116 1727204701.00949: done queuing things up, now waiting for results queue to drain 49116 1727204701.00951: waiting for pending results... 
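The four ifcfg-related tasks that follow (get_profile_stat.yml lines 49, 56, 62 and 69) are all skipped on this run because profile_stat.stat.exists is False; the profile is stored as a NetworkManager keyfile under /etc/NetworkManager/system-connections, as the nmcli output above shows, so there is no ifcfg file to inspect. A hedged sketch of the guard pattern, assuming profile_stat is the registered result of an earlier stat of the ifcfg file (that stat task is not shown in this excerpt), with a hypothetical path, grep pattern and register name:

---
# Illustrative sketch of the skip pattern; only the task name, the when:
# condition and the profile_stat variable come from the log. The ifcfg path,
# the grep pattern and the register name are hypothetical.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_ansible_managed
  when: profile_stat.stat.exists   # False here, so Ansible reports "skipping" with a skip_reason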
49116 1727204701.01192: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-lsr101 49116 1727204701.01473: in run() - task 127b8e07-fff9-02f7-957b-0000000007f6 49116 1727204701.01478: variable 'ansible_search_path' from source: unknown 49116 1727204701.01481: variable 'ansible_search_path' from source: unknown 49116 1727204701.01484: calling self._execute() 49116 1727204701.01519: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.01530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.01549: variable 'omit' from source: magic vars 49116 1727204701.02096: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.02117: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.02291: variable 'profile_stat' from source: set_fact 49116 1727204701.02314: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204701.02323: when evaluation is False, skipping this task 49116 1727204701.02331: _execute() done 49116 1727204701.02338: dumping result to json 49116 1727204701.02345: done dumping result, returning 49116 1727204701.02356: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-lsr101 [127b8e07-fff9-02f7-957b-0000000007f6] 49116 1727204701.02375: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f6 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204701.02685: no more pending results, returning what we have 49116 1727204701.02690: results queue empty 49116 1727204701.02691: checking for any_errors_fatal 49116 1727204701.02700: done checking for any_errors_fatal 49116 1727204701.02701: checking for max_fail_percentage 49116 1727204701.02703: done checking for max_fail_percentage 49116 1727204701.02706: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.02714: done checking to see if all hosts have failed 49116 1727204701.02715: getting the remaining hosts for this loop 49116 1727204701.02717: done getting the remaining hosts for this loop 49116 1727204701.02723: getting the next task for host managed-node3 49116 1727204701.02733: done getting next task for host managed-node3 49116 1727204701.02737: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 49116 1727204701.02742: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.02749: getting variables 49116 1727204701.02751: in VariableManager get_vars() 49116 1727204701.02809: Calling all_inventory to load vars for managed-node3 49116 1727204701.02813: Calling groups_inventory to load vars for managed-node3 49116 1727204701.02937: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.02946: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f6 49116 1727204701.02949: WORKER PROCESS EXITING 49116 1727204701.02964: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.02969: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.02973: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.04999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.07383: done with get_vars() 49116 1727204701.07432: done getting variables 49116 1727204701.07509: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204701.07653: variable 'profile' from source: include params 49116 1727204701.07657: variable 'item' from source: include params 49116 1727204701.07728: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101] ********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.072) 0:00:24.102 ***** 49116 1727204701.07768: entering _queue_task() for managed-node3/set_fact 49116 1727204701.08195: worker is 1 (out of 1 available) 49116 1727204701.08210: exiting _queue_task() for managed-node3/set_fact 49116 1727204701.08225: done queuing things up, now waiting for results queue to drain 49116 1727204701.08227: waiting for pending results... 
49116 1727204701.08618: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-lsr101 49116 1727204701.08714: in run() - task 127b8e07-fff9-02f7-957b-0000000007f7 49116 1727204701.08719: variable 'ansible_search_path' from source: unknown 49116 1727204701.08722: variable 'ansible_search_path' from source: unknown 49116 1727204701.08823: calling self._execute() 49116 1727204701.08899: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.08913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.08935: variable 'omit' from source: magic vars 49116 1727204701.09409: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.09428: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.09574: variable 'profile_stat' from source: set_fact 49116 1727204701.09606: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204701.09695: when evaluation is False, skipping this task 49116 1727204701.09699: _execute() done 49116 1727204701.09704: dumping result to json 49116 1727204701.09707: done dumping result, returning 49116 1727204701.09710: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-lsr101 [127b8e07-fff9-02f7-957b-0000000007f7] 49116 1727204701.09713: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f7 49116 1727204701.09910: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f7 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204701.09970: no more pending results, returning what we have 49116 1727204701.09975: results queue empty 49116 1727204701.09976: checking for any_errors_fatal 49116 1727204701.09985: done checking for any_errors_fatal 49116 1727204701.09985: checking for max_fail_percentage 49116 1727204701.09988: done checking for max_fail_percentage 49116 1727204701.09989: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.09990: done checking to see if all hosts have failed 49116 1727204701.09991: getting the remaining hosts for this loop 49116 1727204701.09992: done getting the remaining hosts for this loop 49116 1727204701.09997: getting the next task for host managed-node3 49116 1727204701.10006: done getting next task for host managed-node3 49116 1727204701.10010: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 49116 1727204701.10022: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.10028: getting variables 49116 1727204701.10030: in VariableManager get_vars() 49116 1727204701.10149: Calling all_inventory to load vars for managed-node3 49116 1727204701.10152: Calling groups_inventory to load vars for managed-node3 49116 1727204701.10155: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.10174: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.10177: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.10181: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.10864: WORKER PROCESS EXITING 49116 1727204701.12435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.14790: done with get_vars() 49116 1727204701.14842: done getting variables 49116 1727204701.14914: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204701.15060: variable 'profile' from source: include params 49116 1727204701.15064: variable 'item' from source: include params 49116 1727204701.15131: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101] ***************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.074) 0:00:24.176 ***** 49116 1727204701.15175: entering _queue_task() for managed-node3/command 49116 1727204701.15606: worker is 1 (out of 1 available) 49116 1727204701.15621: exiting _queue_task() for managed-node3/command 49116 1727204701.15636: done queuing things up, now waiting for results queue to drain 49116 1727204701.15638: waiting for pending results... 
49116 1727204701.15864: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-lsr101 49116 1727204701.15959: in run() - task 127b8e07-fff9-02f7-957b-0000000007f8 49116 1727204701.15973: variable 'ansible_search_path' from source: unknown 49116 1727204701.15978: variable 'ansible_search_path' from source: unknown 49116 1727204701.16012: calling self._execute() 49116 1727204701.16107: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.16113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.16125: variable 'omit' from source: magic vars 49116 1727204701.16448: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.16460: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.16555: variable 'profile_stat' from source: set_fact 49116 1727204701.16568: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204701.16574: when evaluation is False, skipping this task 49116 1727204701.16577: _execute() done 49116 1727204701.16580: dumping result to json 49116 1727204701.16582: done dumping result, returning 49116 1727204701.16590: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-lsr101 [127b8e07-fff9-02f7-957b-0000000007f8] 49116 1727204701.16595: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f8 49116 1727204701.16696: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f8 49116 1727204701.16699: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204701.16755: no more pending results, returning what we have 49116 1727204701.16759: results queue empty 49116 1727204701.16760: checking for any_errors_fatal 49116 1727204701.16771: done checking for any_errors_fatal 49116 1727204701.16772: checking for max_fail_percentage 49116 1727204701.16774: done checking for max_fail_percentage 49116 1727204701.16775: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.16776: done checking to see if all hosts have failed 49116 1727204701.16776: getting the remaining hosts for this loop 49116 1727204701.16778: done getting the remaining hosts for this loop 49116 1727204701.16783: getting the next task for host managed-node3 49116 1727204701.16791: done getting next task for host managed-node3 49116 1727204701.16794: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 49116 1727204701.16799: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.16804: getting variables 49116 1727204701.16806: in VariableManager get_vars() 49116 1727204701.16853: Calling all_inventory to load vars for managed-node3 49116 1727204701.16855: Calling groups_inventory to load vars for managed-node3 49116 1727204701.16858: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.16877: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.16880: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.16883: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.18327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.19829: done with get_vars() 49116 1727204701.19853: done getting variables 49116 1727204701.19906: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204701.20003: variable 'profile' from source: include params 49116 1727204701.20007: variable 'item' from source: include params 49116 1727204701.20052: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.049) 0:00:24.225 ***** 49116 1727204701.20080: entering _queue_task() for managed-node3/set_fact 49116 1727204701.20373: worker is 1 (out of 1 available) 49116 1727204701.20387: exiting _queue_task() for managed-node3/set_fact 49116 1727204701.20401: done queuing things up, now waiting for results queue to drain 49116 1727204701.20402: waiting for pending results... 
49116 1727204701.20616: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-lsr101 49116 1727204701.20720: in run() - task 127b8e07-fff9-02f7-957b-0000000007f9 49116 1727204701.20777: variable 'ansible_search_path' from source: unknown 49116 1727204701.20780: variable 'ansible_search_path' from source: unknown 49116 1727204701.20783: calling self._execute() 49116 1727204701.20986: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.20990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.20993: variable 'omit' from source: magic vars 49116 1727204701.21371: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.21375: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.21460: variable 'profile_stat' from source: set_fact 49116 1727204701.21478: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204701.21482: when evaluation is False, skipping this task 49116 1727204701.21485: _execute() done 49116 1727204701.21488: dumping result to json 49116 1727204701.21490: done dumping result, returning 49116 1727204701.21498: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-lsr101 [127b8e07-fff9-02f7-957b-0000000007f9] 49116 1727204701.21503: sending task result for task 127b8e07-fff9-02f7-957b-0000000007f9 49116 1727204701.21677: done sending task result for task 127b8e07-fff9-02f7-957b-0000000007f9 49116 1727204701.21680: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204701.21773: no more pending results, returning what we have 49116 1727204701.21777: results queue empty 49116 1727204701.21778: checking for any_errors_fatal 49116 1727204701.21783: done checking for any_errors_fatal 49116 1727204701.21784: checking for max_fail_percentage 49116 1727204701.21786: done checking for max_fail_percentage 49116 1727204701.21787: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.21788: done checking to see if all hosts have failed 49116 1727204701.21788: getting the remaining hosts for this loop 49116 1727204701.21789: done getting the remaining hosts for this loop 49116 1727204701.21793: getting the next task for host managed-node3 49116 1727204701.21801: done getting next task for host managed-node3 49116 1727204701.21804: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 49116 1727204701.21807: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.21810: getting variables 49116 1727204701.21812: in VariableManager get_vars() 49116 1727204701.21886: Calling all_inventory to load vars for managed-node3 49116 1727204701.21889: Calling groups_inventory to load vars for managed-node3 49116 1727204701.21891: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.21901: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.21904: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.21908: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.23380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.24615: done with get_vars() 49116 1727204701.24655: done getting variables 49116 1727204701.24732: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204701.24855: variable 'profile' from source: include params 49116 1727204701.24859: variable 'item' from source: include params 49116 1727204701.24922: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.048) 0:00:24.274 ***** 49116 1727204701.24962: entering _queue_task() for managed-node3/assert 49116 1727204701.25581: worker is 1 (out of 1 available) 49116 1727204701.25592: exiting _queue_task() for managed-node3/assert 49116 1727204701.25605: done queuing things up, now waiting for results queue to drain 49116 1727204701.25607: waiting for pending results... 
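The three assertions from assert_profile_present.yml (lines 5, 10 and 15) check the flags set by the earlier set_fact. A reconstruction consistent with the log follows; only the task names and the asserted variable names appear in the log, the list form of "that" and the surrounding layout are assumed. In this excerpt the first two assertions report "All assertions passed" and the third is still being dispatched when the excerpt ends.

---
# Reconstruction of the three assertions; task names and variable names are
# from the log, everything else is an assumption.
- name: Assert that the profile is present - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint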
49116 1727204701.25786: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'lsr101' 49116 1727204701.25843: in run() - task 127b8e07-fff9-02f7-957b-0000000006b9 49116 1727204701.25946: variable 'ansible_search_path' from source: unknown 49116 1727204701.25950: variable 'ansible_search_path' from source: unknown 49116 1727204701.25954: calling self._execute() 49116 1727204701.26025: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.26031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.26052: variable 'omit' from source: magic vars 49116 1727204701.26429: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.26440: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.26450: variable 'omit' from source: magic vars 49116 1727204701.26495: variable 'omit' from source: magic vars 49116 1727204701.26584: variable 'profile' from source: include params 49116 1727204701.26590: variable 'item' from source: include params 49116 1727204701.26749: variable 'item' from source: include params 49116 1727204701.26753: variable 'omit' from source: magic vars 49116 1727204701.26756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204701.26760: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204701.26795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204701.26799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.26812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.26857: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204701.26860: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.26863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.26961: Set connection var ansible_connection to ssh 49116 1727204701.26974: Set connection var ansible_timeout to 10 49116 1727204701.26982: Set connection var ansible_shell_executable to /bin/sh 49116 1727204701.26988: Set connection var ansible_pipelining to False 49116 1727204701.26991: Set connection var ansible_shell_type to sh 49116 1727204701.27041: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204701.27044: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.27048: variable 'ansible_connection' from source: unknown 49116 1727204701.27051: variable 'ansible_module_compression' from source: unknown 49116 1727204701.27052: variable 'ansible_shell_type' from source: unknown 49116 1727204701.27054: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.27056: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.27058: variable 'ansible_pipelining' from source: unknown 49116 1727204701.27174: variable 'ansible_timeout' from source: unknown 49116 1727204701.27177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.27278: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204701.27305: variable 'omit' from source: magic vars 49116 1727204701.27316: starting attempt loop 49116 1727204701.27323: running the handler 49116 1727204701.27473: variable 'lsr_net_profile_exists' from source: set_fact 49116 1727204701.27484: Evaluated conditional (lsr_net_profile_exists): True 49116 1727204701.27503: handler run complete 49116 1727204701.27613: attempt loop complete, returning result 49116 1727204701.27619: _execute() done 49116 1727204701.27623: dumping result to json 49116 1727204701.27626: done dumping result, returning 49116 1727204701.27628: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'lsr101' [127b8e07-fff9-02f7-957b-0000000006b9] 49116 1727204701.27630: sending task result for task 127b8e07-fff9-02f7-957b-0000000006b9 49116 1727204701.27708: done sending task result for task 127b8e07-fff9-02f7-957b-0000000006b9 49116 1727204701.27712: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 49116 1727204701.27782: no more pending results, returning what we have 49116 1727204701.27787: results queue empty 49116 1727204701.27788: checking for any_errors_fatal 49116 1727204701.27798: done checking for any_errors_fatal 49116 1727204701.27799: checking for max_fail_percentage 49116 1727204701.27801: done checking for max_fail_percentage 49116 1727204701.27803: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.27804: done checking to see if all hosts have failed 49116 1727204701.27805: getting the remaining hosts for this loop 49116 1727204701.27806: done getting the remaining hosts for this loop 49116 1727204701.27810: getting the next task for host managed-node3 49116 1727204701.27818: done getting next task for host managed-node3 49116 1727204701.27821: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 49116 1727204701.27825: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.27829: getting variables 49116 1727204701.27831: in VariableManager get_vars() 49116 1727204701.27888: Calling all_inventory to load vars for managed-node3 49116 1727204701.27891: Calling groups_inventory to load vars for managed-node3 49116 1727204701.27893: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.27906: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.27908: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.27911: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.29931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.32784: done with get_vars() 49116 1727204701.32835: done getting variables 49116 1727204701.32924: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204701.33075: variable 'profile' from source: include params 49116 1727204701.33079: variable 'item' from source: include params 49116 1727204701.33155: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101'] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.082) 0:00:24.356 ***** 49116 1727204701.33200: entering _queue_task() for managed-node3/assert 49116 1727204701.33782: worker is 1 (out of 1 available) 49116 1727204701.33795: exiting _queue_task() for managed-node3/assert 49116 1727204701.33808: done queuing things up, now waiting for results queue to drain 49116 1727204701.33809: waiting for pending results... 
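Throughout this block, 'profile' and 'item' resolve from "include params" and render as lsr101 in the templated task names. That is the signature of a task file being included with per-item vars; one plausible include that would produce it is sketched below. Only the variable names, the lsr101 / lsr101.90 profile names and the assert_profile_present.yml file appear in the log; the loop structure and the including file are assumptions.

---
# Hypothetical include; one plausible way the 'profile' and 'item' include
# params seen in the log could be supplied, not the actual test code.
- name: Check that each expected profile is present
  ansible.builtin.include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"   # rendered into task names such as "Assert that the profile is present - 'lsr101'"
  loop:
    - lsr101
    - lsr101.90   # assumption: the VLAN profile found by nmcli is checked the same way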
49116 1727204701.33929: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'lsr101' 49116 1727204701.34014: in run() - task 127b8e07-fff9-02f7-957b-0000000006ba 49116 1727204701.34027: variable 'ansible_search_path' from source: unknown 49116 1727204701.34030: variable 'ansible_search_path' from source: unknown 49116 1727204701.34074: calling self._execute() 49116 1727204701.34174: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.34178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.34182: variable 'omit' from source: magic vars 49116 1727204701.34500: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.34511: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.34517: variable 'omit' from source: magic vars 49116 1727204701.34552: variable 'omit' from source: magic vars 49116 1727204701.34637: variable 'profile' from source: include params 49116 1727204701.34641: variable 'item' from source: include params 49116 1727204701.34688: variable 'item' from source: include params 49116 1727204701.34706: variable 'omit' from source: magic vars 49116 1727204701.34743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204701.34777: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204701.34793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204701.34811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.34822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.34851: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204701.34855: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.34857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.34931: Set connection var ansible_connection to ssh 49116 1727204701.34942: Set connection var ansible_timeout to 10 49116 1727204701.34950: Set connection var ansible_shell_executable to /bin/sh 49116 1727204701.34956: Set connection var ansible_pipelining to False 49116 1727204701.34959: Set connection var ansible_shell_type to sh 49116 1727204701.34964: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204701.34984: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.34987: variable 'ansible_connection' from source: unknown 49116 1727204701.34989: variable 'ansible_module_compression' from source: unknown 49116 1727204701.34992: variable 'ansible_shell_type' from source: unknown 49116 1727204701.34996: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.34998: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.35003: variable 'ansible_pipelining' from source: unknown 49116 1727204701.35005: variable 'ansible_timeout' from source: unknown 49116 1727204701.35010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.35123: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204701.35137: variable 'omit' from source: magic vars 49116 1727204701.35140: starting attempt loop 49116 1727204701.35143: running the handler 49116 1727204701.35227: variable 'lsr_net_profile_ansible_managed' from source: set_fact 49116 1727204701.35230: Evaluated conditional (lsr_net_profile_ansible_managed): True 49116 1727204701.35241: handler run complete 49116 1727204701.35254: attempt loop complete, returning result 49116 1727204701.35257: _execute() done 49116 1727204701.35260: dumping result to json 49116 1727204701.35263: done dumping result, returning 49116 1727204701.35269: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'lsr101' [127b8e07-fff9-02f7-957b-0000000006ba] 49116 1727204701.35276: sending task result for task 127b8e07-fff9-02f7-957b-0000000006ba 49116 1727204701.35370: done sending task result for task 127b8e07-fff9-02f7-957b-0000000006ba 49116 1727204701.35372: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 49116 1727204701.35429: no more pending results, returning what we have 49116 1727204701.35435: results queue empty 49116 1727204701.35436: checking for any_errors_fatal 49116 1727204701.35443: done checking for any_errors_fatal 49116 1727204701.35444: checking for max_fail_percentage 49116 1727204701.35447: done checking for max_fail_percentage 49116 1727204701.35448: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.35449: done checking to see if all hosts have failed 49116 1727204701.35449: getting the remaining hosts for this loop 49116 1727204701.35451: done getting the remaining hosts for this loop 49116 1727204701.35455: getting the next task for host managed-node3 49116 1727204701.35462: done getting next task for host managed-node3 49116 1727204701.35467: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 49116 1727204701.35471: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.35475: getting variables 49116 1727204701.35477: in VariableManager get_vars() 49116 1727204701.35522: Calling all_inventory to load vars for managed-node3 49116 1727204701.35525: Calling groups_inventory to load vars for managed-node3 49116 1727204701.35527: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.35541: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.35544: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.35547: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.36612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.37957: done with get_vars() 49116 1727204701.37984: done getting variables 49116 1727204701.38038: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204701.38136: variable 'profile' from source: include params 49116 1727204701.38139: variable 'item' from source: include params 49116 1727204701.38187: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.050) 0:00:24.407 ***** 49116 1727204701.38216: entering _queue_task() for managed-node3/assert 49116 1727204701.38509: worker is 1 (out of 1 available) 49116 1727204701.38524: exiting _queue_task() for managed-node3/assert 49116 1727204701.38541: done queuing things up, now waiting for results queue to drain 49116 1727204701.38542: waiting for pending results... 
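The two assert tasks traced here ("Assert that the ansible managed comment is present in 'lsr101'" above, and "Assert that the fingerprint comment is present in lsr101" queued just before this point) both pass because the booleans lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint were set to true by an earlier pass of get_profile_stat.yml for this profile. Based only on the task names, the task path (assert_profile_present.yml) and the conditionals the log shows being evaluated, the tasks plausibly look like the sketch below; the exact wording, any fail messages, and where the ansible_distribution_major_version != '6' guard is attached (task, block or play level) are assumptions, not taken from the log:

    # assert_profile_present.yml (sketch reconstructed from the log; details assumed)
    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_ansible_managed     # evaluated True in the log above

    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      assert:
        that:
          - lsr_net_profile_fingerprint         # evaluated True in the log below
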
49116 1727204701.38743: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in lsr101 49116 1727204701.38818: in run() - task 127b8e07-fff9-02f7-957b-0000000006bb 49116 1727204701.38830: variable 'ansible_search_path' from source: unknown 49116 1727204701.38836: variable 'ansible_search_path' from source: unknown 49116 1727204701.38870: calling self._execute() 49116 1727204701.38960: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.38964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.38976: variable 'omit' from source: magic vars 49116 1727204701.39294: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.39304: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.39313: variable 'omit' from source: magic vars 49116 1727204701.39350: variable 'omit' from source: magic vars 49116 1727204701.39431: variable 'profile' from source: include params 49116 1727204701.39439: variable 'item' from source: include params 49116 1727204701.39490: variable 'item' from source: include params 49116 1727204701.39506: variable 'omit' from source: magic vars 49116 1727204701.39544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204701.39580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204701.39598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204701.39613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.39625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.39653: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204701.39658: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.39662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.39738: Set connection var ansible_connection to ssh 49116 1727204701.39748: Set connection var ansible_timeout to 10 49116 1727204701.39756: Set connection var ansible_shell_executable to /bin/sh 49116 1727204701.39762: Set connection var ansible_pipelining to False 49116 1727204701.39765: Set connection var ansible_shell_type to sh 49116 1727204701.39773: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204701.39795: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.39798: variable 'ansible_connection' from source: unknown 49116 1727204701.39801: variable 'ansible_module_compression' from source: unknown 49116 1727204701.39804: variable 'ansible_shell_type' from source: unknown 49116 1727204701.39806: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.39808: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.39811: variable 'ansible_pipelining' from source: unknown 49116 1727204701.39814: variable 'ansible_timeout' from source: unknown 49116 1727204701.39819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.39941: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204701.39948: variable 'omit' from source: magic vars 49116 1727204701.39954: starting attempt loop 49116 1727204701.39957: running the handler 49116 1727204701.40050: variable 'lsr_net_profile_fingerprint' from source: set_fact 49116 1727204701.40054: Evaluated conditional (lsr_net_profile_fingerprint): True 49116 1727204701.40061: handler run complete 49116 1727204701.40078: attempt loop complete, returning result 49116 1727204701.40081: _execute() done 49116 1727204701.40084: dumping result to json 49116 1727204701.40087: done dumping result, returning 49116 1727204701.40093: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in lsr101 [127b8e07-fff9-02f7-957b-0000000006bb] 49116 1727204701.40099: sending task result for task 127b8e07-fff9-02f7-957b-0000000006bb 49116 1727204701.40196: done sending task result for task 127b8e07-fff9-02f7-957b-0000000006bb 49116 1727204701.40199: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 49116 1727204701.40255: no more pending results, returning what we have 49116 1727204701.40259: results queue empty 49116 1727204701.40260: checking for any_errors_fatal 49116 1727204701.40270: done checking for any_errors_fatal 49116 1727204701.40270: checking for max_fail_percentage 49116 1727204701.40273: done checking for max_fail_percentage 49116 1727204701.40274: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.40275: done checking to see if all hosts have failed 49116 1727204701.40276: getting the remaining hosts for this loop 49116 1727204701.40277: done getting the remaining hosts for this loop 49116 1727204701.40281: getting the next task for host managed-node3 49116 1727204701.40290: done getting next task for host managed-node3 49116 1727204701.40293: ^ task is: TASK: Include the task 'get_profile_stat.yml' 49116 1727204701.40296: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.40300: getting variables 49116 1727204701.40302: in VariableManager get_vars() 49116 1727204701.40350: Calling all_inventory to load vars for managed-node3 49116 1727204701.40353: Calling groups_inventory to load vars for managed-node3 49116 1727204701.40355: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.40373: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.40376: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.40380: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.41449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.42668: done with get_vars() 49116 1727204701.42696: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.045) 0:00:24.452 ***** 49116 1727204701.42782: entering _queue_task() for managed-node3/include_tasks 49116 1727204701.43080: worker is 1 (out of 1 available) 49116 1727204701.43094: exiting _queue_task() for managed-node3/include_tasks 49116 1727204701.43108: done queuing things up, now waiting for results queue to drain 49116 1727204701.43110: waiting for pending results... 49116 1727204701.43307: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 49116 1727204701.43395: in run() - task 127b8e07-fff9-02f7-957b-0000000006bf 49116 1727204701.43407: variable 'ansible_search_path' from source: unknown 49116 1727204701.43411: variable 'ansible_search_path' from source: unknown 49116 1727204701.43447: calling self._execute() 49116 1727204701.43529: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.43532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.43543: variable 'omit' from source: magic vars 49116 1727204701.43859: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.43872: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.43879: _execute() done 49116 1727204701.43882: dumping result to json 49116 1727204701.43887: done dumping result, returning 49116 1727204701.43895: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-02f7-957b-0000000006bf] 49116 1727204701.43899: sending task result for task 127b8e07-fff9-02f7-957b-0000000006bf 49116 1727204701.44003: done sending task result for task 127b8e07-fff9-02f7-957b-0000000006bf 49116 1727204701.44006: WORKER PROCESS EXITING 49116 1727204701.44041: no more pending results, returning what we have 49116 1727204701.44046: in VariableManager get_vars() 49116 1727204701.44101: Calling all_inventory to load vars for managed-node3 49116 1727204701.44104: Calling groups_inventory to load vars for managed-node3 49116 1727204701.44106: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.44121: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.44124: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.44127: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.45341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 49116 1727204701.46551: done with get_vars() 49116 1727204701.46580: variable 'ansible_search_path' from source: unknown 49116 1727204701.46581: variable 'ansible_search_path' from source: unknown 49116 1727204701.46614: we have included files to process 49116 1727204701.46615: generating all_blocks data 49116 1727204701.46617: done generating all_blocks data 49116 1727204701.46620: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49116 1727204701.46621: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49116 1727204701.46623: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49116 1727204701.47301: done processing included file 49116 1727204701.47303: iterating over new_blocks loaded from include file 49116 1727204701.47304: in VariableManager get_vars() 49116 1727204701.47321: done with get_vars() 49116 1727204701.47323: filtering new block on tags 49116 1727204701.47345: done filtering new block on tags 49116 1727204701.47347: in VariableManager get_vars() 49116 1727204701.47362: done with get_vars() 49116 1727204701.47363: filtering new block on tags 49116 1727204701.47380: done filtering new block on tags 49116 1727204701.47381: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 49116 1727204701.47386: extending task lists for all hosts with included blocks 49116 1727204701.47505: done extending task lists 49116 1727204701.47506: done processing included files 49116 1727204701.47507: results queue empty 49116 1727204701.47507: checking for any_errors_fatal 49116 1727204701.47510: done checking for any_errors_fatal 49116 1727204701.47511: checking for max_fail_percentage 49116 1727204701.47511: done checking for max_fail_percentage 49116 1727204701.47512: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.47513: done checking to see if all hosts have failed 49116 1727204701.47513: getting the remaining hosts for this loop 49116 1727204701.47514: done getting the remaining hosts for this loop 49116 1727204701.47516: getting the next task for host managed-node3 49116 1727204701.47518: done getting next task for host managed-node3 49116 1727204701.47520: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 49116 1727204701.47522: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.47524: getting variables 49116 1727204701.47524: in VariableManager get_vars() 49116 1727204701.47536: Calling all_inventory to load vars for managed-node3 49116 1727204701.47538: Calling groups_inventory to load vars for managed-node3 49116 1727204701.47540: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.47546: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.47548: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.47550: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.48528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.49742: done with get_vars() 49116 1727204701.49774: done getting variables 49116 1727204701.49813: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.070) 0:00:24.523 ***** 49116 1727204701.49842: entering _queue_task() for managed-node3/set_fact 49116 1727204701.50148: worker is 1 (out of 1 available) 49116 1727204701.50161: exiting _queue_task() for managed-node3/set_fact 49116 1727204701.50176: done queuing things up, now waiting for results queue to drain 49116 1727204701.50178: waiting for pending results... 
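The include_tasks task above runs entirely on the controller: no connection is opened, the included file get_profile_stat.yml is loaded and parsed, its blocks are filtered on tags, and the host's task list is extended in place. The set_fact task being queued here is the first task of that included file (get_profile_stat.yml:3); judging by the result recorded a little further down, it resets three flags before the profile is probed. A minimal sketch of both pieces, with the included path, fact names and values taken from the log and everything else assumed:

    # assert_profile_present.yml:3 (sketch)
    - name: Include the task 'get_profile_stat.yml'
      include_tasks: get_profile_stat.yml

    # get_profile_stat.yml:3 (sketch; values match the set_fact result logged below)
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
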
49116 1727204701.50369: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 49116 1727204701.50452: in run() - task 127b8e07-fff9-02f7-957b-000000000838 49116 1727204701.50468: variable 'ansible_search_path' from source: unknown 49116 1727204701.50472: variable 'ansible_search_path' from source: unknown 49116 1727204701.50509: calling self._execute() 49116 1727204701.50595: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.50599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.50610: variable 'omit' from source: magic vars 49116 1727204701.50927: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.50938: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.50945: variable 'omit' from source: magic vars 49116 1727204701.50989: variable 'omit' from source: magic vars 49116 1727204701.51018: variable 'omit' from source: magic vars 49116 1727204701.51057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204701.51094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204701.51111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204701.51127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.51138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.51163: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204701.51169: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.51173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.51246: Set connection var ansible_connection to ssh 49116 1727204701.51257: Set connection var ansible_timeout to 10 49116 1727204701.51264: Set connection var ansible_shell_executable to /bin/sh 49116 1727204701.51271: Set connection var ansible_pipelining to False 49116 1727204701.51274: Set connection var ansible_shell_type to sh 49116 1727204701.51280: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204701.51302: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.51306: variable 'ansible_connection' from source: unknown 49116 1727204701.51310: variable 'ansible_module_compression' from source: unknown 49116 1727204701.51313: variable 'ansible_shell_type' from source: unknown 49116 1727204701.51315: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.51318: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.51320: variable 'ansible_pipelining' from source: unknown 49116 1727204701.51323: variable 'ansible_timeout' from source: unknown 49116 1727204701.51325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.51441: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204701.51453: variable 
'omit' from source: magic vars 49116 1727204701.51458: starting attempt loop 49116 1727204701.51462: running the handler 49116 1727204701.51475: handler run complete 49116 1727204701.51484: attempt loop complete, returning result 49116 1727204701.51487: _execute() done 49116 1727204701.51489: dumping result to json 49116 1727204701.51492: done dumping result, returning 49116 1727204701.51501: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-02f7-957b-000000000838] 49116 1727204701.51503: sending task result for task 127b8e07-fff9-02f7-957b-000000000838 49116 1727204701.51596: done sending task result for task 127b8e07-fff9-02f7-957b-000000000838 49116 1727204701.51599: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 49116 1727204701.51680: no more pending results, returning what we have 49116 1727204701.51683: results queue empty 49116 1727204701.51684: checking for any_errors_fatal 49116 1727204701.51686: done checking for any_errors_fatal 49116 1727204701.51687: checking for max_fail_percentage 49116 1727204701.51688: done checking for max_fail_percentage 49116 1727204701.51689: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.51690: done checking to see if all hosts have failed 49116 1727204701.51691: getting the remaining hosts for this loop 49116 1727204701.51693: done getting the remaining hosts for this loop 49116 1727204701.51697: getting the next task for host managed-node3 49116 1727204701.51704: done getting next task for host managed-node3 49116 1727204701.51707: ^ task is: TASK: Stat profile file 49116 1727204701.51710: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.51714: getting variables 49116 1727204701.51716: in VariableManager get_vars() 49116 1727204701.51763: Calling all_inventory to load vars for managed-node3 49116 1727204701.51768: Calling groups_inventory to load vars for managed-node3 49116 1727204701.51770: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.51781: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.51783: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.51785: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.52804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.54010: done with get_vars() 49116 1727204701.54043: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.042) 0:00:24.566 ***** 49116 1727204701.54123: entering _queue_task() for managed-node3/stat 49116 1727204701.54425: worker is 1 (out of 1 available) 49116 1727204701.54445: exiting _queue_task() for managed-node3/stat 49116 1727204701.54459: done queuing things up, now waiting for results queue to drain 49116 1727204701.54460: waiting for pending results... 49116 1727204701.54650: running TaskExecutor() for managed-node3/TASK: Stat profile file 49116 1727204701.54742: in run() - task 127b8e07-fff9-02f7-957b-000000000839 49116 1727204701.54753: variable 'ansible_search_path' from source: unknown 49116 1727204701.54757: variable 'ansible_search_path' from source: unknown 49116 1727204701.54791: calling self._execute() 49116 1727204701.54876: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.54880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.54890: variable 'omit' from source: magic vars 49116 1727204701.55201: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.55213: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.55223: variable 'omit' from source: magic vars 49116 1727204701.55263: variable 'omit' from source: magic vars 49116 1727204701.55346: variable 'profile' from source: include params 49116 1727204701.55350: variable 'item' from source: include params 49116 1727204701.55399: variable 'item' from source: include params 49116 1727204701.55415: variable 'omit' from source: magic vars 49116 1727204701.55455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204701.55492: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204701.55509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204701.55523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.55537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204701.55563: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204701.55568: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.55571: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.55646: Set connection var ansible_connection to ssh 49116 1727204701.55658: Set connection var ansible_timeout to 10 49116 1727204701.55666: Set connection var ansible_shell_executable to /bin/sh 49116 1727204701.55672: Set connection var ansible_pipelining to False 49116 1727204701.55677: Set connection var ansible_shell_type to sh 49116 1727204701.55680: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204701.55703: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.55706: variable 'ansible_connection' from source: unknown 49116 1727204701.55709: variable 'ansible_module_compression' from source: unknown 49116 1727204701.55712: variable 'ansible_shell_type' from source: unknown 49116 1727204701.55714: variable 'ansible_shell_executable' from source: unknown 49116 1727204701.55717: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.55719: variable 'ansible_pipelining' from source: unknown 49116 1727204701.55722: variable 'ansible_timeout' from source: unknown 49116 1727204701.55727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.55894: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204701.55904: variable 'omit' from source: magic vars 49116 1727204701.55911: starting attempt loop 49116 1727204701.55914: running the handler 49116 1727204701.55926: _low_level_execute_command(): starting 49116 1727204701.55935: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204701.56512: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.56518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.56522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.56574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204701.56588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204701.56676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204701.58527: stdout chunk (state=3): >>>/root <<< 49116 1727204701.58634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204701.58703: stderr chunk (state=3): >>><<< 49116 1727204701.58707: stdout chunk 
(state=3): >>><<< 49116 1727204701.58728: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204701.58743: _low_level_execute_command(): starting 49116 1727204701.58749: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262 `" && echo ansible-tmp-1727204701.5872889-50671-35330006975262="` echo /root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262 `" ) && sleep 0' 49116 1727204701.59259: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.59263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.59275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 49116 1727204701.59279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.59321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204701.59325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204701.59328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204701.59408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204701.61581: stdout chunk (state=3): >>>ansible-tmp-1727204701.5872889-50671-35330006975262=/root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262 <<< 49116 1727204701.61695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 49116 1727204701.61762: stderr chunk (state=3): >>><<< 49116 1727204701.61768: stdout chunk (state=3): >>><<< 49116 1727204701.61786: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204701.5872889-50671-35330006975262=/root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204701.61830: variable 'ansible_module_compression' from source: unknown 49116 1727204701.61881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 49116 1727204701.61915: variable 'ansible_facts' from source: unknown 49116 1727204701.61984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/AnsiballZ_stat.py 49116 1727204701.62100: Sending initial data 49116 1727204701.62103: Sent initial data (152 bytes) 49116 1727204701.62578: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.62581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204701.62612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.62615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204701.62617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.62620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.62682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204701.62686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204701.62688: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 49116 1727204701.62766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204701.64575: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204701.64639: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204701.64707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpvha3ap1w /root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/AnsiballZ_stat.py <<< 49116 1727204701.64716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/AnsiballZ_stat.py" <<< 49116 1727204701.64781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpvha3ap1w" to remote "/root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/AnsiballZ_stat.py" <<< 49116 1727204701.64784: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/AnsiballZ_stat.py" <<< 49116 1727204701.65473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204701.65545: stderr chunk (state=3): >>><<< 49116 1727204701.65549: stdout chunk (state=3): >>><<< 49116 1727204701.65578: done transferring module to remote 49116 1727204701.65588: _low_level_execute_command(): starting 49116 1727204701.65593: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/ /root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/AnsiballZ_stat.py && sleep 0' 49116 1727204701.66063: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.66069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 49116 1727204701.66097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.66100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.66107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.66165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204701.66171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204701.66178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204701.66290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204701.68320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204701.68402: stderr chunk (state=3): >>><<< 49116 1727204701.68406: stdout chunk (state=3): >>><<< 49116 1727204701.68409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204701.68412: _low_level_execute_command(): starting 49116 1727204701.68414: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/AnsiballZ_stat.py && sleep 0' 49116 1727204701.68909: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.68914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.68916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204701.68918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204701.68975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1846617821' <<< 49116 1727204701.68979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204701.69099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204701.86986: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 49116 1727204701.88597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204701.88643: stderr chunk (state=3): >>><<< 49116 1727204701.88655: stdout chunk (state=3): >>><<< 49116 1727204701.88685: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
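The "Stat profile file" task shows the full remote-execution round trip for an ordinary module: a temp directory is created over the persistent SSH ControlMaster connection, the cached AnsiballZ_stat.py payload is uploaded with sftp, made executable, run with the remote /usr/bin/python3.12, and the temp directory is removed, with the module result ({"stat": {"exists": false}, ...}) coming back on stdout. From the module arguments logged above, the task itself likely resembles the sketch below; templating the path as ifcfg-{{ profile }} (expanding to ifcfg-lsr101.90 in this iteration) and the register name profile_stat are inferences from the surrounding log, not shown verbatim:

    # get_profile_stat.yml:9 (sketch; argument values taken from the logged module_args)
    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat      # assumed; the next task's condition references profile_stat.stat.exists
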
49116 1727204701.88729: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204701.88831: _low_level_execute_command(): starting 49116 1727204701.88837: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204701.5872889-50671-35330006975262/ > /dev/null 2>&1 && sleep 0' 49116 1727204701.89576: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204701.89638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204701.89712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204701.92101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204701.92105: stdout chunk (state=3): >>><<< 49116 1727204701.92107: stderr chunk (state=3): >>><<< 49116 1727204701.92123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204701.92172: handler run complete 49116 1727204701.92179: attempt loop complete, returning result 49116 1727204701.92185: _execute() done 49116 1727204701.92190: dumping result to json 49116 1727204701.92198: done dumping result, returning 49116 1727204701.92210: done running TaskExecutor() for managed-node3/TASK: Stat profile file [127b8e07-fff9-02f7-957b-000000000839] 49116 1727204701.92219: sending task result for task 127b8e07-fff9-02f7-957b-000000000839 49116 1727204701.92575: done sending task result for task 127b8e07-fff9-02f7-957b-000000000839 49116 1727204701.92579: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 49116 1727204701.92650: no more pending results, returning what we have 49116 1727204701.92654: results queue empty 49116 1727204701.92655: checking for any_errors_fatal 49116 1727204701.92660: done checking for any_errors_fatal 49116 1727204701.92661: checking for max_fail_percentage 49116 1727204701.92663: done checking for max_fail_percentage 49116 1727204701.92664: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.92665: done checking to see if all hosts have failed 49116 1727204701.92667: getting the remaining hosts for this loop 49116 1727204701.92669: done getting the remaining hosts for this loop 49116 1727204701.92673: getting the next task for host managed-node3 49116 1727204701.92681: done getting next task for host managed-node3 49116 1727204701.92684: ^ task is: TASK: Set NM profile exist flag based on the profile files 49116 1727204701.92692: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.92697: getting variables 49116 1727204701.92698: in VariableManager get_vars() 49116 1727204701.92748: Calling all_inventory to load vars for managed-node3 49116 1727204701.92751: Calling groups_inventory to load vars for managed-node3 49116 1727204701.92754: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.92804: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.92809: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.92813: Calling groups_plugins_play to load vars for managed-node3 49116 1727204701.94977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204701.97344: done with get_vars() 49116 1727204701.97396: done getting variables 49116 1727204701.97481: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:05:01 -0400 (0:00:00.433) 0:00:25.000 ***** 49116 1727204701.97518: entering _queue_task() for managed-node3/set_fact 49116 1727204701.98008: worker is 1 (out of 1 available) 49116 1727204701.98028: exiting _queue_task() for managed-node3/set_fact 49116 1727204701.98046: done queuing things up, now waiting for results queue to drain 49116 1727204701.98047: waiting for pending results... 
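The task queued here, "Set NM profile exist flag based on the profile files" (get_profile_stat.yml:17), is skipped just below because its condition profile_stat.stat.exists is False for ifcfg-lsr101.90. Given the flag it is evidently meant to set and the false_condition reported in the skip result, its most plausible shape is the following sketch; the exact fact value and any additional conditions are assumptions:

    # get_profile_stat.yml:17 (sketch; only the when-condition is confirmed by the log)
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists
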
49116 1727204701.98575: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 49116 1727204701.98582: in run() - task 127b8e07-fff9-02f7-957b-00000000083a 49116 1727204701.98584: variable 'ansible_search_path' from source: unknown 49116 1727204701.98587: variable 'ansible_search_path' from source: unknown 49116 1727204701.98589: calling self._execute() 49116 1727204701.98709: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204701.98725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204701.98747: variable 'omit' from source: magic vars 49116 1727204701.99224: variable 'ansible_distribution_major_version' from source: facts 49116 1727204701.99257: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204701.99414: variable 'profile_stat' from source: set_fact 49116 1727204701.99440: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204701.99449: when evaluation is False, skipping this task 49116 1727204701.99457: _execute() done 49116 1727204701.99472: dumping result to json 49116 1727204701.99484: done dumping result, returning 49116 1727204701.99497: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-02f7-957b-00000000083a] 49116 1727204701.99508: sending task result for task 127b8e07-fff9-02f7-957b-00000000083a skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204701.99750: no more pending results, returning what we have 49116 1727204701.99756: results queue empty 49116 1727204701.99757: checking for any_errors_fatal 49116 1727204701.99771: done checking for any_errors_fatal 49116 1727204701.99773: checking for max_fail_percentage 49116 1727204701.99775: done checking for max_fail_percentage 49116 1727204701.99776: checking to see if all hosts have failed and the running result is not ok 49116 1727204701.99777: done checking to see if all hosts have failed 49116 1727204701.99778: getting the remaining hosts for this loop 49116 1727204701.99779: done getting the remaining hosts for this loop 49116 1727204701.99785: getting the next task for host managed-node3 49116 1727204701.99867: done getting next task for host managed-node3 49116 1727204701.99872: ^ task is: TASK: Get NM profile info 49116 1727204701.99877: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204701.99882: getting variables 49116 1727204701.99884: in VariableManager get_vars() 49116 1727204701.99935: Calling all_inventory to load vars for managed-node3 49116 1727204701.99939: Calling groups_inventory to load vars for managed-node3 49116 1727204701.99941: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204701.99957: Calling all_plugins_play to load vars for managed-node3 49116 1727204701.99961: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204701.99964: Calling groups_plugins_play to load vars for managed-node3 49116 1727204702.00702: done sending task result for task 127b8e07-fff9-02f7-957b-00000000083a 49116 1727204702.00707: WORKER PROCESS EXITING 49116 1727204702.02261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204702.04580: done with get_vars() 49116 1727204702.04620: done getting variables 49116 1727204702.04699: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.072) 0:00:25.072 ***** 49116 1727204702.04739: entering _queue_task() for managed-node3/shell 49116 1727204702.05149: worker is 1 (out of 1 available) 49116 1727204702.05164: exiting _queue_task() for managed-node3/shell 49116 1727204702.05379: done queuing things up, now waiting for results queue to drain 49116 1727204702.05381: waiting for pending results... 
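Editor's note: the "Set NM profile exist flag based on the profile files" task above (get_profile_stat.yml:17) was skipped because profile_stat.stat.exists evaluated to False. From the logged action (set_fact) and the logged guard, the task probably has roughly the shape sketched below; the fact name lsr_net_profile_exists is an assumption, inferred from the lsr_net_profile_* facts that appear later in this run.

    # Hedged sketch of get_profile_stat.yml:17, reconstructed from the debug log.
    # Only the set_fact action and the when: guard are confirmed above; the fact
    # name lsr_net_profile_exists is hypothetical.
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists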
49116 1727204702.05499: running TaskExecutor() for managed-node3/TASK: Get NM profile info 49116 1727204702.05648: in run() - task 127b8e07-fff9-02f7-957b-00000000083b 49116 1727204702.05676: variable 'ansible_search_path' from source: unknown 49116 1727204702.05683: variable 'ansible_search_path' from source: unknown 49116 1727204702.05737: calling self._execute() 49116 1727204702.05857: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.05945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.05950: variable 'omit' from source: magic vars 49116 1727204702.06341: variable 'ansible_distribution_major_version' from source: facts 49116 1727204702.06362: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204702.06382: variable 'omit' from source: magic vars 49116 1727204702.06446: variable 'omit' from source: magic vars 49116 1727204702.06581: variable 'profile' from source: include params 49116 1727204702.06598: variable 'item' from source: include params 49116 1727204702.06682: variable 'item' from source: include params 49116 1727204702.06715: variable 'omit' from source: magic vars 49116 1727204702.06773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204702.06827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204702.06916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204702.06920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204702.06922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204702.06941: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204702.06950: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.06958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.07086: Set connection var ansible_connection to ssh 49116 1727204702.07106: Set connection var ansible_timeout to 10 49116 1727204702.07119: Set connection var ansible_shell_executable to /bin/sh 49116 1727204702.07136: Set connection var ansible_pipelining to False 49116 1727204702.07148: Set connection var ansible_shell_type to sh 49116 1727204702.07159: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204702.07193: variable 'ansible_shell_executable' from source: unknown 49116 1727204702.07242: variable 'ansible_connection' from source: unknown 49116 1727204702.07245: variable 'ansible_module_compression' from source: unknown 49116 1727204702.07248: variable 'ansible_shell_type' from source: unknown 49116 1727204702.07252: variable 'ansible_shell_executable' from source: unknown 49116 1727204702.07258: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.07260: variable 'ansible_pipelining' from source: unknown 49116 1727204702.07263: variable 'ansible_timeout' from source: unknown 49116 1727204702.07266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.07424: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204702.07447: variable 'omit' from source: magic vars 49116 1727204702.07471: starting attempt loop 49116 1727204702.07474: running the handler 49116 1727204702.07569: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204702.07575: _low_level_execute_command(): starting 49116 1727204702.07578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204702.08389: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204702.08456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204702.08540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.08561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204702.08584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204702.08598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204702.08712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204702.10597: stdout chunk (state=3): >>>/root <<< 49116 1727204702.10837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204702.10842: stdout chunk (state=3): >>><<< 49116 1727204702.10845: stderr chunk (state=3): >>><<< 49116 1727204702.10874: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204702.10898: _low_level_execute_command(): starting 49116 1727204702.10945: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921 `" && echo ansible-tmp-1727204702.1088307-50685-163086140462921="` echo /root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921 `" ) && sleep 0' 49116 1727204702.11715: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204702.11751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.11820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204702.11824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204702.11911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204702.14128: stdout chunk (state=3): >>>ansible-tmp-1727204702.1088307-50685-163086140462921=/root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921 <<< 49116 1727204702.14298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204702.14369: stderr chunk (state=3): >>><<< 49116 1727204702.14414: stdout chunk (state=3): >>><<< 49116 1727204702.14463: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204702.1088307-50685-163086140462921=/root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204702.14481: variable 'ansible_module_compression' from source: unknown 49116 1727204702.14548: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204702.14584: variable 'ansible_facts' from source: unknown 49116 1727204702.14644: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/AnsiballZ_command.py 49116 1727204702.14764: Sending initial data 49116 1727204702.14770: Sent initial data (156 bytes) 49116 1727204702.15254: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204702.15258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.15292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204702.15296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.15355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204702.15358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204702.15368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204702.15437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204702.17270: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204702.17325: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204702.17397: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpcqy_doz8 /root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/AnsiballZ_command.py <<< 49116 1727204702.17400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/AnsiballZ_command.py" <<< 49116 1727204702.17463: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpcqy_doz8" to remote "/root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/AnsiballZ_command.py" <<< 49116 1727204702.18297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204702.18323: stderr chunk (state=3): >>><<< 49116 1727204702.18422: stdout chunk (state=3): >>><<< 49116 1727204702.18426: done transferring module to remote 49116 1727204702.18428: _low_level_execute_command(): starting 49116 1727204702.18431: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/ /root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/AnsiballZ_command.py && sleep 0' 49116 1727204702.18840: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204702.18855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204702.18885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.18928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204702.18931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204702.18934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204702.19017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204702.21139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204702.21144: stdout chunk (state=3): >>><<< 49116 1727204702.21373: stderr chunk (state=3): >>><<< 49116 1727204702.21379: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204702.21382: _low_level_execute_command(): starting 49116 1727204702.21384: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/AnsiballZ_command.py && sleep 0' 49116 1727204702.21801: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204702.21805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.21824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 49116 1727204702.21843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204702.21846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.21902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204702.21905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204702.21995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204702.42260: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-24 15:05:02.399341", "end": "2024-09-24 15:05:02.421290", "delta": "0:00:00.021949", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204702.44081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204702.44142: stderr chunk (state=3): >>><<< 49116 1727204702.44146: stdout chunk (state=3): >>><<< 49116 1727204702.44163: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-24 15:05:02.399341", "end": "2024-09-24 15:05:02.421290", "delta": "0:00:00.021949", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
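Editor's note: the module result just logged shows exactly what the "Get NM profile info" task (get_profile_stat.yml:25) executed. A plausible reconstruction follows; the register name matches the nm_profile_exists variable the next conditional checks, while changed_when and ignore_errors are assumptions (changed_when: false would explain the final ok: result reporting changed: false even though the raw module returned changed: true).

    # Hedged sketch of get_profile_stat.yml:25. The command string is taken
    # verbatim from the module result above, with the literal lsr101.90 replaced
    # by the {{ profile }} include param the log shows being templated in.
    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
      register: nm_profile_exists
      changed_when: false   # assumption: explains "changed": false in the ok: output
      ignore_errors: true   # assumption: grep exits non-zero when no profile matches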
49116 1727204702.44198: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204702.44208: _low_level_execute_command(): starting 49116 1727204702.44211: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204702.1088307-50685-163086140462921/ > /dev/null 2>&1 && sleep 0' 49116 1727204702.44715: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204702.44723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.44726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204702.44729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204702.44736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204702.44784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204702.44788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204702.44790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204702.44865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204702.46918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204702.46980: stderr chunk (state=3): >>><<< 49116 1727204702.46984: stdout chunk (state=3): >>><<< 49116 1727204702.47002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204702.47009: handler run complete 49116 1727204702.47031: Evaluated conditional (False): False 49116 1727204702.47041: attempt loop complete, returning result 49116 1727204702.47044: _execute() done 49116 1727204702.47046: dumping result to json 49116 1727204702.47051: done dumping result, returning 49116 1727204702.47059: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [127b8e07-fff9-02f7-957b-00000000083b] 49116 1727204702.47066: sending task result for task 127b8e07-fff9-02f7-957b-00000000083b 49116 1727204702.47174: done sending task result for task 127b8e07-fff9-02f7-957b-00000000083b 49116 1727204702.47177: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "delta": "0:00:00.021949", "end": "2024-09-24 15:05:02.421290", "rc": 0, "start": "2024-09-24 15:05:02.399341" } STDOUT: lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 49116 1727204702.47257: no more pending results, returning what we have 49116 1727204702.47261: results queue empty 49116 1727204702.47262: checking for any_errors_fatal 49116 1727204702.47273: done checking for any_errors_fatal 49116 1727204702.47274: checking for max_fail_percentage 49116 1727204702.47276: done checking for max_fail_percentage 49116 1727204702.47277: checking to see if all hosts have failed and the running result is not ok 49116 1727204702.47278: done checking to see if all hosts have failed 49116 1727204702.47279: getting the remaining hosts for this loop 49116 1727204702.47280: done getting the remaining hosts for this loop 49116 1727204702.47289: getting the next task for host managed-node3 49116 1727204702.47296: done getting next task for host managed-node3 49116 1727204702.47299: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 49116 1727204702.47303: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204702.47307: getting variables 49116 1727204702.47309: in VariableManager get_vars() 49116 1727204702.47352: Calling all_inventory to load vars for managed-node3 49116 1727204702.47355: Calling groups_inventory to load vars for managed-node3 49116 1727204702.47357: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204702.47371: Calling all_plugins_play to load vars for managed-node3 49116 1727204702.47374: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204702.47381: Calling groups_plugins_play to load vars for managed-node3 49116 1727204702.48417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204702.49625: done with get_vars() 49116 1727204702.49660: done getting variables 49116 1727204702.49713: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.450) 0:00:25.522 ***** 49116 1727204702.49743: entering _queue_task() for managed-node3/set_fact 49116 1727204702.50042: worker is 1 (out of 1 available) 49116 1727204702.50057: exiting _queue_task() for managed-node3/set_fact 49116 1727204702.50072: done queuing things up, now waiting for results queue to drain 49116 1727204702.50075: waiting for pending results... 
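Editor's note: the set_fact queued above inspects the registered result of the nmcli task. For orientation, here is roughly what that registered variable contains, with every value copied from the module output logged earlier in this run (additional fields such as stdout_lines are omitted); this illustrates the registered data, it is not playbook content.

    # Approximate contents of the registered nm_profile_exists variable,
    # assembled from the module result printed earlier in this run.
    nm_profile_exists:
      rc: 0
      cmd: "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc"
      stdout: "lsr101.90  /etc/NetworkManager/system-connections/lsr101.90.nmconnection"
      stderr: ""
      start: "2024-09-24 15:05:02.399341"
      end: "2024-09-24 15:05:02.421290"
      delta: "0:00:00.021949"
      changed: true   # raw module value; the displayed task result shows changed: false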
49116 1727204702.50274: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 49116 1727204702.50362: in run() - task 127b8e07-fff9-02f7-957b-00000000083c 49116 1727204702.50378: variable 'ansible_search_path' from source: unknown 49116 1727204702.50382: variable 'ansible_search_path' from source: unknown 49116 1727204702.50419: calling self._execute() 49116 1727204702.50502: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.50507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.50517: variable 'omit' from source: magic vars 49116 1727204702.50828: variable 'ansible_distribution_major_version' from source: facts 49116 1727204702.50842: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204702.50947: variable 'nm_profile_exists' from source: set_fact 49116 1727204702.50961: Evaluated conditional (nm_profile_exists.rc == 0): True 49116 1727204702.50970: variable 'omit' from source: magic vars 49116 1727204702.51015: variable 'omit' from source: magic vars 49116 1727204702.51040: variable 'omit' from source: magic vars 49116 1727204702.51080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204702.51112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204702.51130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204702.51147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204702.51157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204702.51187: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204702.51191: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.51194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.51271: Set connection var ansible_connection to ssh 49116 1727204702.51284: Set connection var ansible_timeout to 10 49116 1727204702.51292: Set connection var ansible_shell_executable to /bin/sh 49116 1727204702.51297: Set connection var ansible_pipelining to False 49116 1727204702.51300: Set connection var ansible_shell_type to sh 49116 1727204702.51308: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204702.51327: variable 'ansible_shell_executable' from source: unknown 49116 1727204702.51330: variable 'ansible_connection' from source: unknown 49116 1727204702.51336: variable 'ansible_module_compression' from source: unknown 49116 1727204702.51338: variable 'ansible_shell_type' from source: unknown 49116 1727204702.51341: variable 'ansible_shell_executable' from source: unknown 49116 1727204702.51343: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.51346: variable 'ansible_pipelining' from source: unknown 49116 1727204702.51349: variable 'ansible_timeout' from source: unknown 49116 1727204702.51351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.51470: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204702.51480: variable 'omit' from source: magic vars 49116 1727204702.51486: starting attempt loop 49116 1727204702.51490: running the handler 49116 1727204702.51503: handler run complete 49116 1727204702.51512: attempt loop complete, returning result 49116 1727204702.51515: _execute() done 49116 1727204702.51518: dumping result to json 49116 1727204702.51522: done dumping result, returning 49116 1727204702.51530: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-02f7-957b-00000000083c] 49116 1727204702.51537: sending task result for task 127b8e07-fff9-02f7-957b-00000000083c 49116 1727204702.51628: done sending task result for task 127b8e07-fff9-02f7-957b-00000000083c 49116 1727204702.51634: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 49116 1727204702.51699: no more pending results, returning what we have 49116 1727204702.51702: results queue empty 49116 1727204702.51703: checking for any_errors_fatal 49116 1727204702.51713: done checking for any_errors_fatal 49116 1727204702.51714: checking for max_fail_percentage 49116 1727204702.51716: done checking for max_fail_percentage 49116 1727204702.51717: checking to see if all hosts have failed and the running result is not ok 49116 1727204702.51718: done checking to see if all hosts have failed 49116 1727204702.51719: getting the remaining hosts for this loop 49116 1727204702.51720: done getting the remaining hosts for this loop 49116 1727204702.51724: getting the next task for host managed-node3 49116 1727204702.51737: done getting next task for host managed-node3 49116 1727204702.51739: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 49116 1727204702.51744: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204702.51749: getting variables 49116 1727204702.51750: in VariableManager get_vars() 49116 1727204702.51804: Calling all_inventory to load vars for managed-node3 49116 1727204702.51807: Calling groups_inventory to load vars for managed-node3 49116 1727204702.51809: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204702.51823: Calling all_plugins_play to load vars for managed-node3 49116 1727204702.51826: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204702.51829: Calling groups_plugins_play to load vars for managed-node3 49116 1727204702.53009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204702.54269: done with get_vars() 49116 1727204702.54302: done getting variables 49116 1727204702.54357: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204702.54460: variable 'profile' from source: include params 49116 1727204702.54464: variable 'item' from source: include params 49116 1727204702.54511: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101.90] ********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.048) 0:00:25.570 ***** 49116 1727204702.54546: entering _queue_task() for managed-node3/command 49116 1727204702.54843: worker is 1 (out of 1 available) 49116 1727204702.54858: exiting _queue_task() for managed-node3/command 49116 1727204702.54873: done queuing things up, now waiting for results queue to drain 49116 1727204702.54874: waiting for pending results... 
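Editor's note: the "Set NM profile exist flag and ansible_managed flag true based on the nmcli output" task that just completed (get_profile_stat.yml:35) is well constrained by the log: the guard nm_profile_exists.rc == 0 evaluated True and the ok: result lists the three lsr_net_profile_* facts. A sketch consistent with that output:

    # Hedged sketch of get_profile_stat.yml:35; the fact names and values are
    # copied from the ok: result above, the guard from the evaluated conditional.
    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0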
49116 1727204702.55080: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 49116 1727204702.55170: in run() - task 127b8e07-fff9-02f7-957b-00000000083e 49116 1727204702.55185: variable 'ansible_search_path' from source: unknown 49116 1727204702.55188: variable 'ansible_search_path' from source: unknown 49116 1727204702.55224: calling self._execute() 49116 1727204702.55308: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.55312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.55326: variable 'omit' from source: magic vars 49116 1727204702.55637: variable 'ansible_distribution_major_version' from source: facts 49116 1727204702.55646: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204702.55873: variable 'profile_stat' from source: set_fact 49116 1727204702.55877: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204702.55879: when evaluation is False, skipping this task 49116 1727204702.55882: _execute() done 49116 1727204702.55884: dumping result to json 49116 1727204702.55886: done dumping result, returning 49116 1727204702.55889: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 [127b8e07-fff9-02f7-957b-00000000083e] 49116 1727204702.55895: sending task result for task 127b8e07-fff9-02f7-957b-00000000083e skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204702.56184: no more pending results, returning what we have 49116 1727204702.56190: results queue empty 49116 1727204702.56191: checking for any_errors_fatal 49116 1727204702.56199: done checking for any_errors_fatal 49116 1727204702.56200: checking for max_fail_percentage 49116 1727204702.56202: done checking for max_fail_percentage 49116 1727204702.56203: checking to see if all hosts have failed and the running result is not ok 49116 1727204702.56204: done checking to see if all hosts have failed 49116 1727204702.56205: getting the remaining hosts for this loop 49116 1727204702.56206: done getting the remaining hosts for this loop 49116 1727204702.56212: getting the next task for host managed-node3 49116 1727204702.56221: done getting next task for host managed-node3 49116 1727204702.56224: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 49116 1727204702.56229: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204702.56238: getting variables 49116 1727204702.56240: in VariableManager get_vars() 49116 1727204702.56411: Calling all_inventory to load vars for managed-node3 49116 1727204702.56415: Calling groups_inventory to load vars for managed-node3 49116 1727204702.56418: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204702.56436: Calling all_plugins_play to load vars for managed-node3 49116 1727204702.56440: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204702.56443: Calling groups_plugins_play to load vars for managed-node3 49116 1727204702.57563: done sending task result for task 127b8e07-fff9-02f7-957b-00000000083e 49116 1727204702.57571: WORKER PROCESS EXITING 49116 1727204702.58479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204702.60516: done with get_vars() 49116 1727204702.60564: done getting variables 49116 1727204702.60737: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204702.60863: variable 'profile' from source: include params 49116 1727204702.60869: variable 'item' from source: include params 49116 1727204702.60934: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101.90] ******************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.064) 0:00:25.634 ***** 49116 1727204702.60972: entering _queue_task() for managed-node3/set_fact 49116 1727204702.61355: worker is 1 (out of 1 available) 49116 1727204702.61369: exiting _queue_task() for managed-node3/set_fact 49116 1727204702.61382: done queuing things up, now waiting for results queue to drain 49116 1727204702.61384: waiting for pending results... 
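Editor's note: the "Get the ansible_managed comment in ifcfg-lsr101.90" task (get_profile_stat.yml:49) was skipped, and the "Verify" counterpart at line 56 queued here is about to be skipped for the same reason, so neither body appears in the log. The sketch below is only a heavily hedged guess at their shape: the grep pattern, ifcfg path, and register/fact names are all assumptions; only the task names, the command/set_fact actions, and the profile_stat.stat.exists guard are confirmed by the log. The fingerprint pair at lines 62 and 69, skipped next, follows the same Get/Verify pattern.

    # Hypothetical sketch of get_profile_stat.yml:49 and :56; every detail not
    # shown in the log (grep pattern, path, register/fact names) is a guess.
    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: ifcfg_ansible_managed   # hypothetical name
      when: profile_stat.stat.exists

    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_ansible_managed: "{{ ifcfg_ansible_managed.rc == 0 }}"   # hypothetical
      when: profile_stat.stat.exists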
49116 1727204702.61902: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 49116 1727204702.61908: in run() - task 127b8e07-fff9-02f7-957b-00000000083f 49116 1727204702.61911: variable 'ansible_search_path' from source: unknown 49116 1727204702.61913: variable 'ansible_search_path' from source: unknown 49116 1727204702.61916: calling self._execute() 49116 1727204702.62173: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.62178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.62181: variable 'omit' from source: magic vars 49116 1727204702.62421: variable 'ansible_distribution_major_version' from source: facts 49116 1727204702.62480: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204702.62563: variable 'profile_stat' from source: set_fact 49116 1727204702.62580: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204702.62583: when evaluation is False, skipping this task 49116 1727204702.62587: _execute() done 49116 1727204702.62590: dumping result to json 49116 1727204702.62592: done dumping result, returning 49116 1727204702.62601: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 [127b8e07-fff9-02f7-957b-00000000083f] 49116 1727204702.62606: sending task result for task 127b8e07-fff9-02f7-957b-00000000083f 49116 1727204702.62912: done sending task result for task 127b8e07-fff9-02f7-957b-00000000083f 49116 1727204702.62916: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204702.62963: no more pending results, returning what we have 49116 1727204702.62969: results queue empty 49116 1727204702.62970: checking for any_errors_fatal 49116 1727204702.62977: done checking for any_errors_fatal 49116 1727204702.62977: checking for max_fail_percentage 49116 1727204702.62979: done checking for max_fail_percentage 49116 1727204702.62980: checking to see if all hosts have failed and the running result is not ok 49116 1727204702.62981: done checking to see if all hosts have failed 49116 1727204702.62982: getting the remaining hosts for this loop 49116 1727204702.62983: done getting the remaining hosts for this loop 49116 1727204702.62987: getting the next task for host managed-node3 49116 1727204702.62994: done getting next task for host managed-node3 49116 1727204702.62996: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 49116 1727204702.63000: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204702.63004: getting variables 49116 1727204702.63006: in VariableManager get_vars() 49116 1727204702.63046: Calling all_inventory to load vars for managed-node3 49116 1727204702.63049: Calling groups_inventory to load vars for managed-node3 49116 1727204702.63051: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204702.63064: Calling all_plugins_play to load vars for managed-node3 49116 1727204702.63068: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204702.63072: Calling groups_plugins_play to load vars for managed-node3 49116 1727204702.64829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204702.67021: done with get_vars() 49116 1727204702.67061: done getting variables 49116 1727204702.67132: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204702.67258: variable 'profile' from source: include params 49116 1727204702.67262: variable 'item' from source: include params 49116 1727204702.67324: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101.90] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.063) 0:00:25.698 ***** 49116 1727204702.67359: entering _queue_task() for managed-node3/command 49116 1727204702.67749: worker is 1 (out of 1 available) 49116 1727204702.67763: exiting _queue_task() for managed-node3/command 49116 1727204702.67778: done queuing things up, now waiting for results queue to drain 49116 1727204702.67779: waiting for pending results... 
49116 1727204702.68190: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-lsr101.90 49116 1727204702.68209: in run() - task 127b8e07-fff9-02f7-957b-000000000840 49116 1727204702.68228: variable 'ansible_search_path' from source: unknown 49116 1727204702.68232: variable 'ansible_search_path' from source: unknown 49116 1727204702.68271: calling self._execute() 49116 1727204702.68672: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.68676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.68680: variable 'omit' from source: magic vars 49116 1727204702.68807: variable 'ansible_distribution_major_version' from source: facts 49116 1727204702.68819: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204702.68958: variable 'profile_stat' from source: set_fact 49116 1727204702.68980: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204702.68983: when evaluation is False, skipping this task 49116 1727204702.68987: _execute() done 49116 1727204702.68989: dumping result to json 49116 1727204702.68992: done dumping result, returning 49116 1727204702.69001: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-lsr101.90 [127b8e07-fff9-02f7-957b-000000000840] 49116 1727204702.69007: sending task result for task 127b8e07-fff9-02f7-957b-000000000840 49116 1727204702.69107: done sending task result for task 127b8e07-fff9-02f7-957b-000000000840 49116 1727204702.69110: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204702.69172: no more pending results, returning what we have 49116 1727204702.69177: results queue empty 49116 1727204702.69178: checking for any_errors_fatal 49116 1727204702.69185: done checking for any_errors_fatal 49116 1727204702.69186: checking for max_fail_percentage 49116 1727204702.69188: done checking for max_fail_percentage 49116 1727204702.69189: checking to see if all hosts have failed and the running result is not ok 49116 1727204702.69190: done checking to see if all hosts have failed 49116 1727204702.69191: getting the remaining hosts for this loop 49116 1727204702.69192: done getting the remaining hosts for this loop 49116 1727204702.69197: getting the next task for host managed-node3 49116 1727204702.69206: done getting next task for host managed-node3 49116 1727204702.69209: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 49116 1727204702.69213: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204702.69220: getting variables 49116 1727204702.69222: in VariableManager get_vars() 49116 1727204702.69272: Calling all_inventory to load vars for managed-node3 49116 1727204702.69275: Calling groups_inventory to load vars for managed-node3 49116 1727204702.69278: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204702.69295: Calling all_plugins_play to load vars for managed-node3 49116 1727204702.69299: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204702.69302: Calling groups_plugins_play to load vars for managed-node3 49116 1727204702.79024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204702.82429: done with get_vars() 49116 1727204702.82685: done getting variables 49116 1727204702.82744: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204702.82857: variable 'profile' from source: include params 49116 1727204702.82860: variable 'item' from source: include params 49116 1727204702.83153: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101.90] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.158) 0:00:25.856 ***** 49116 1727204702.83189: entering _queue_task() for managed-node3/set_fact 49116 1727204702.84114: worker is 1 (out of 1 available) 49116 1727204702.84128: exiting _queue_task() for managed-node3/set_fact 49116 1727204702.84142: done queuing things up, now waiting for results queue to drain 49116 1727204702.84144: waiting for pending results... 
49116 1727204702.84516: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 49116 1727204702.84692: in run() - task 127b8e07-fff9-02f7-957b-000000000841 49116 1727204702.84698: variable 'ansible_search_path' from source: unknown 49116 1727204702.84701: variable 'ansible_search_path' from source: unknown 49116 1727204702.84710: calling self._execute() 49116 1727204702.84969: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.84973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.84976: variable 'omit' from source: magic vars 49116 1727204702.85375: variable 'ansible_distribution_major_version' from source: facts 49116 1727204702.85380: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204702.85431: variable 'profile_stat' from source: set_fact 49116 1727204702.85450: Evaluated conditional (profile_stat.stat.exists): False 49116 1727204702.85454: when evaluation is False, skipping this task 49116 1727204702.85458: _execute() done 49116 1727204702.85461: dumping result to json 49116 1727204702.85464: done dumping result, returning 49116 1727204702.85472: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 [127b8e07-fff9-02f7-957b-000000000841] 49116 1727204702.85479: sending task result for task 127b8e07-fff9-02f7-957b-000000000841 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49116 1727204702.85820: no more pending results, returning what we have 49116 1727204702.85824: results queue empty 49116 1727204702.85825: checking for any_errors_fatal 49116 1727204702.85830: done checking for any_errors_fatal 49116 1727204702.85831: checking for max_fail_percentage 49116 1727204702.85833: done checking for max_fail_percentage 49116 1727204702.85834: checking to see if all hosts have failed and the running result is not ok 49116 1727204702.85835: done checking to see if all hosts have failed 49116 1727204702.85835: getting the remaining hosts for this loop 49116 1727204702.85837: done getting the remaining hosts for this loop 49116 1727204702.85841: getting the next task for host managed-node3 49116 1727204702.85850: done getting next task for host managed-node3 49116 1727204702.85853: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 49116 1727204702.85856: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204702.85861: getting variables 49116 1727204702.85863: in VariableManager get_vars() 49116 1727204702.85906: Calling all_inventory to load vars for managed-node3 49116 1727204702.85909: Calling groups_inventory to load vars for managed-node3 49116 1727204702.85912: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204702.85925: Calling all_plugins_play to load vars for managed-node3 49116 1727204702.85928: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204702.85933: Calling groups_plugins_play to load vars for managed-node3 49116 1727204702.86672: done sending task result for task 127b8e07-fff9-02f7-957b-000000000841 49116 1727204702.86676: WORKER PROCESS EXITING 49116 1727204702.88061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204702.90410: done with get_vars() 49116 1727204702.90446: done getting variables 49116 1727204702.90515: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204702.90648: variable 'profile' from source: include params 49116 1727204702.90652: variable 'item' from source: include params 49116 1727204702.90715: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101.90'] ************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.075) 0:00:25.932 ***** 49116 1727204702.90749: entering _queue_task() for managed-node3/assert 49116 1727204702.91138: worker is 1 (out of 1 available) 49116 1727204702.91151: exiting _queue_task() for managed-node3/assert 49116 1727204702.91164: done queuing things up, now waiting for results queue to drain 49116 1727204702.91369: waiting for pending results... 
49116 1727204702.91486: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'lsr101.90' 49116 1727204702.91602: in run() - task 127b8e07-fff9-02f7-957b-0000000006c0 49116 1727204702.91626: variable 'ansible_search_path' from source: unknown 49116 1727204702.91630: variable 'ansible_search_path' from source: unknown 49116 1727204702.91672: calling self._execute() 49116 1727204702.91792: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.91796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.91808: variable 'omit' from source: magic vars 49116 1727204702.92232: variable 'ansible_distribution_major_version' from source: facts 49116 1727204702.92243: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204702.92250: variable 'omit' from source: magic vars 49116 1727204702.92301: variable 'omit' from source: magic vars 49116 1727204702.92571: variable 'profile' from source: include params 49116 1727204702.92575: variable 'item' from source: include params 49116 1727204702.92578: variable 'item' from source: include params 49116 1727204702.92582: variable 'omit' from source: magic vars 49116 1727204702.92585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204702.92594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204702.92616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204702.92638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204702.92649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204702.92682: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204702.92685: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.92688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.92797: Set connection var ansible_connection to ssh 49116 1727204702.92822: Set connection var ansible_timeout to 10 49116 1727204702.92834: Set connection var ansible_shell_executable to /bin/sh 49116 1727204702.92845: Set connection var ansible_pipelining to False 49116 1727204702.92851: Set connection var ansible_shell_type to sh 49116 1727204702.92861: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204702.92892: variable 'ansible_shell_executable' from source: unknown 49116 1727204702.92899: variable 'ansible_connection' from source: unknown 49116 1727204702.92906: variable 'ansible_module_compression' from source: unknown 49116 1727204702.92916: variable 'ansible_shell_type' from source: unknown 49116 1727204702.92923: variable 'ansible_shell_executable' from source: unknown 49116 1727204702.92929: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204702.92937: variable 'ansible_pipelining' from source: unknown 49116 1727204702.92944: variable 'ansible_timeout' from source: unknown 49116 1727204702.92951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204702.93112: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204702.93134: variable 'omit' from source: magic vars 49116 1727204702.93146: starting attempt loop 49116 1727204702.93153: running the handler 49116 1727204702.93289: variable 'lsr_net_profile_exists' from source: set_fact 49116 1727204702.93301: Evaluated conditional (lsr_net_profile_exists): True 49116 1727204702.93312: handler run complete 49116 1727204702.93335: attempt loop complete, returning result 49116 1727204702.93343: _execute() done 49116 1727204702.93357: dumping result to json 49116 1727204702.93364: done dumping result, returning 49116 1727204702.93379: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'lsr101.90' [127b8e07-fff9-02f7-957b-0000000006c0] 49116 1727204702.93462: sending task result for task 127b8e07-fff9-02f7-957b-0000000006c0 49116 1727204702.93539: done sending task result for task 127b8e07-fff9-02f7-957b-0000000006c0 49116 1727204702.93542: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 49116 1727204702.93623: no more pending results, returning what we have 49116 1727204702.93627: results queue empty 49116 1727204702.93629: checking for any_errors_fatal 49116 1727204702.93636: done checking for any_errors_fatal 49116 1727204702.93637: checking for max_fail_percentage 49116 1727204702.93639: done checking for max_fail_percentage 49116 1727204702.93640: checking to see if all hosts have failed and the running result is not ok 49116 1727204702.93641: done checking to see if all hosts have failed 49116 1727204702.93642: getting the remaining hosts for this loop 49116 1727204702.93643: done getting the remaining hosts for this loop 49116 1727204702.93648: getting the next task for host managed-node3 49116 1727204702.93656: done getting next task for host managed-node3 49116 1727204702.93659: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 49116 1727204702.93663: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204702.93669: getting variables 49116 1727204702.93671: in VariableManager get_vars() 49116 1727204702.93720: Calling all_inventory to load vars for managed-node3 49116 1727204702.93723: Calling groups_inventory to load vars for managed-node3 49116 1727204702.93726: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204702.93741: Calling all_plugins_play to load vars for managed-node3 49116 1727204702.93744: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204702.93748: Calling groups_plugins_play to load vars for managed-node3 49116 1727204702.95863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204702.99277: done with get_vars() 49116 1727204702.99320: done getting variables 49116 1727204702.99392: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204702.99747: variable 'profile' from source: include params 49116 1727204702.99752: variable 'item' from source: include params 49116 1727204702.99825: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101.90'] ******* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 15:05:02 -0400 (0:00:00.091) 0:00:26.023 ***** 49116 1727204703.00021: entering _queue_task() for managed-node3/assert 49116 1727204703.00722: worker is 1 (out of 1 available) 49116 1727204703.00738: exiting _queue_task() for managed-node3/assert 49116 1727204703.00752: done queuing things up, now waiting for results queue to drain 49116 1727204703.00753: waiting for pending results... 
49116 1727204703.01691: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'lsr101.90' 49116 1727204703.01753: in run() - task 127b8e07-fff9-02f7-957b-0000000006c1 49116 1727204703.01819: variable 'ansible_search_path' from source: unknown 49116 1727204703.01846: variable 'ansible_search_path' from source: unknown 49116 1727204703.02030: calling self._execute() 49116 1727204703.02296: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.02308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.02493: variable 'omit' from source: magic vars 49116 1727204703.03030: variable 'ansible_distribution_major_version' from source: facts 49116 1727204703.03053: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204703.03066: variable 'omit' from source: magic vars 49116 1727204703.03148: variable 'omit' from source: magic vars 49116 1727204703.03236: variable 'profile' from source: include params 49116 1727204703.03246: variable 'item' from source: include params 49116 1727204703.03324: variable 'item' from source: include params 49116 1727204703.03349: variable 'omit' from source: magic vars 49116 1727204703.03405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204703.03472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204703.03484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204703.03508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204703.03583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204703.03586: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204703.03589: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.03592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.03695: Set connection var ansible_connection to ssh 49116 1727204703.03715: Set connection var ansible_timeout to 10 49116 1727204703.03727: Set connection var ansible_shell_executable to /bin/sh 49116 1727204703.03737: Set connection var ansible_pipelining to False 49116 1727204703.03744: Set connection var ansible_shell_type to sh 49116 1727204703.03753: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204703.03786: variable 'ansible_shell_executable' from source: unknown 49116 1727204703.03800: variable 'ansible_connection' from source: unknown 49116 1727204703.03807: variable 'ansible_module_compression' from source: unknown 49116 1727204703.03814: variable 'ansible_shell_type' from source: unknown 49116 1727204703.03910: variable 'ansible_shell_executable' from source: unknown 49116 1727204703.03913: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.03915: variable 'ansible_pipelining' from source: unknown 49116 1727204703.03918: variable 'ansible_timeout' from source: unknown 49116 1727204703.03920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.04003: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204703.04028: variable 'omit' from source: magic vars 49116 1727204703.04041: starting attempt loop 49116 1727204703.04050: running the handler 49116 1727204703.04188: variable 'lsr_net_profile_ansible_managed' from source: set_fact 49116 1727204703.04199: Evaluated conditional (lsr_net_profile_ansible_managed): True 49116 1727204703.04210: handler run complete 49116 1727204703.04232: attempt loop complete, returning result 49116 1727204703.04242: _execute() done 49116 1727204703.04250: dumping result to json 49116 1727204703.04257: done dumping result, returning 49116 1727204703.04271: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'lsr101.90' [127b8e07-fff9-02f7-957b-0000000006c1] 49116 1727204703.04283: sending task result for task 127b8e07-fff9-02f7-957b-0000000006c1 49116 1727204703.04671: done sending task result for task 127b8e07-fff9-02f7-957b-0000000006c1 49116 1727204703.04675: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 49116 1727204703.04721: no more pending results, returning what we have 49116 1727204703.04724: results queue empty 49116 1727204703.04725: checking for any_errors_fatal 49116 1727204703.04731: done checking for any_errors_fatal 49116 1727204703.04732: checking for max_fail_percentage 49116 1727204703.04734: done checking for max_fail_percentage 49116 1727204703.04735: checking to see if all hosts have failed and the running result is not ok 49116 1727204703.04736: done checking to see if all hosts have failed 49116 1727204703.04736: getting the remaining hosts for this loop 49116 1727204703.04738: done getting the remaining hosts for this loop 49116 1727204703.04741: getting the next task for host managed-node3 49116 1727204703.04748: done getting next task for host managed-node3 49116 1727204703.04751: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 49116 1727204703.04754: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204703.04759: getting variables 49116 1727204703.04760: in VariableManager get_vars() 49116 1727204703.04803: Calling all_inventory to load vars for managed-node3 49116 1727204703.04806: Calling groups_inventory to load vars for managed-node3 49116 1727204703.04808: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204703.04821: Calling all_plugins_play to load vars for managed-node3 49116 1727204703.04824: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204703.04827: Calling groups_plugins_play to load vars for managed-node3 49116 1727204703.06782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204703.08976: done with get_vars() 49116 1727204703.09013: done getting variables 49116 1727204703.09083: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204703.09204: variable 'profile' from source: include params 49116 1727204703.09209: variable 'item' from source: include params 49116 1727204703.09273: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101.90] ************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.094) 0:00:26.118 ***** 49116 1727204703.09311: entering _queue_task() for managed-node3/assert 49116 1727204703.09695: worker is 1 (out of 1 available) 49116 1727204703.09710: exiting _queue_task() for managed-node3/assert 49116 1727204703.09722: done queuing things up, now waiting for results queue to drain 49116 1727204703.09724: waiting for pending results... 
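The three assert tasks queued in this stretch of the log come from assert_profile_present.yml (lines 5, 10 and 15), and each checks a single boolean fact set earlier in the run: lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint. A sketch of what that task file plausibly contains is below; the task names, the assert module and the asserted conditions are confirmed by the log, while the fail_msg lines are illustrative additions.

---
# Sketch of assert_profile_present.yml based on the task names and conditionals
# visible in the log; fail_msg values are illustrative, not quoted from the file.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "profile '{{ profile }}' was not created"

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed
    fail_msg: "ansible managed comment missing from '{{ profile }}'"

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint
    fail_msg: "fingerprint comment missing from {{ profile }}"
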
49116 1727204703.10033: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in lsr101.90 49116 1727204703.10157: in run() - task 127b8e07-fff9-02f7-957b-0000000006c2 49116 1727204703.10187: variable 'ansible_search_path' from source: unknown 49116 1727204703.10198: variable 'ansible_search_path' from source: unknown 49116 1727204703.10242: calling self._execute() 49116 1727204703.10356: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.10370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.10384: variable 'omit' from source: magic vars 49116 1727204703.10808: variable 'ansible_distribution_major_version' from source: facts 49116 1727204703.10847: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204703.10850: variable 'omit' from source: magic vars 49116 1727204703.10894: variable 'omit' from source: magic vars 49116 1727204703.11170: variable 'profile' from source: include params 49116 1727204703.11174: variable 'item' from source: include params 49116 1727204703.11176: variable 'item' from source: include params 49116 1727204703.11178: variable 'omit' from source: magic vars 49116 1727204703.11180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204703.11200: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204703.11224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204703.11246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204703.11260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204703.11303: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204703.11312: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.11319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.11430: Set connection var ansible_connection to ssh 49116 1727204703.11447: Set connection var ansible_timeout to 10 49116 1727204703.11457: Set connection var ansible_shell_executable to /bin/sh 49116 1727204703.11467: Set connection var ansible_pipelining to False 49116 1727204703.11474: Set connection var ansible_shell_type to sh 49116 1727204703.11482: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204703.11512: variable 'ansible_shell_executable' from source: unknown 49116 1727204703.11519: variable 'ansible_connection' from source: unknown 49116 1727204703.11525: variable 'ansible_module_compression' from source: unknown 49116 1727204703.11530: variable 'ansible_shell_type' from source: unknown 49116 1727204703.11535: variable 'ansible_shell_executable' from source: unknown 49116 1727204703.11541: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.11547: variable 'ansible_pipelining' from source: unknown 49116 1727204703.11552: variable 'ansible_timeout' from source: unknown 49116 1727204703.11558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.11972: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204703.11976: variable 'omit' from source: magic vars 49116 1727204703.11979: starting attempt loop 49116 1727204703.11981: running the handler 49116 1727204703.12372: variable 'lsr_net_profile_fingerprint' from source: set_fact 49116 1727204703.12376: Evaluated conditional (lsr_net_profile_fingerprint): True 49116 1727204703.12378: handler run complete 49116 1727204703.12381: attempt loop complete, returning result 49116 1727204703.12383: _execute() done 49116 1727204703.12385: dumping result to json 49116 1727204703.12388: done dumping result, returning 49116 1727204703.12390: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in lsr101.90 [127b8e07-fff9-02f7-957b-0000000006c2] 49116 1727204703.12392: sending task result for task 127b8e07-fff9-02f7-957b-0000000006c2 49116 1727204703.12472: done sending task result for task 127b8e07-fff9-02f7-957b-0000000006c2 49116 1727204703.12475: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 49116 1727204703.12535: no more pending results, returning what we have 49116 1727204703.12539: results queue empty 49116 1727204703.12540: checking for any_errors_fatal 49116 1727204703.12550: done checking for any_errors_fatal 49116 1727204703.12551: checking for max_fail_percentage 49116 1727204703.12554: done checking for max_fail_percentage 49116 1727204703.12555: checking to see if all hosts have failed and the running result is not ok 49116 1727204703.12556: done checking to see if all hosts have failed 49116 1727204703.12557: getting the remaining hosts for this loop 49116 1727204703.12558: done getting the remaining hosts for this loop 49116 1727204703.12564: getting the next task for host managed-node3 49116 1727204703.12575: done getting next task for host managed-node3 49116 1727204703.12579: ^ task is: TASK: TEARDOWN: remove profiles. 49116 1727204703.12582: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204703.12588: getting variables 49116 1727204703.12590: in VariableManager get_vars() 49116 1727204703.12641: Calling all_inventory to load vars for managed-node3 49116 1727204703.12645: Calling groups_inventory to load vars for managed-node3 49116 1727204703.12648: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204703.12662: Calling all_plugins_play to load vars for managed-node3 49116 1727204703.12813: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204703.12821: Calling groups_plugins_play to load vars for managed-node3 49116 1727204703.16816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204703.19412: done with get_vars() 49116 1727204703.19444: done getting variables 49116 1727204703.19514: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:58 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.102) 0:00:26.220 ***** 49116 1727204703.19545: entering _queue_task() for managed-node3/debug 49116 1727204703.19935: worker is 1 (out of 1 available) 49116 1727204703.19952: exiting _queue_task() for managed-node3/debug 49116 1727204703.20169: done queuing things up, now waiting for results queue to drain 49116 1727204703.20171: waiting for pending results... 49116 1727204703.20287: running TaskExecutor() for managed-node3/TASK: TEARDOWN: remove profiles. 
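The "TEARDOWN: remove profiles." task queued above (tests_vlan_mtu.yml:58) is a plain debug action; its result, shown a little further down, is just a banner of '#' characters. Sketched as a task under the assumption that the banner is passed as a literal msg:

---
# Sketch of the teardown banner task at tests_vlan_mtu.yml:58; the module (debug)
# and the '#' banner in the result are visible in the log, the exact msg literal
# is assumed.
- name: "TEARDOWN: remove profiles."
  debug:
    msg: "##################################################"
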
49116 1727204703.20406: in run() - task 127b8e07-fff9-02f7-957b-00000000005d 49116 1727204703.20431: variable 'ansible_search_path' from source: unknown 49116 1727204703.20482: calling self._execute() 49116 1727204703.20606: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.20732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.20736: variable 'omit' from source: magic vars 49116 1727204703.21068: variable 'ansible_distribution_major_version' from source: facts 49116 1727204703.21087: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204703.21099: variable 'omit' from source: magic vars 49116 1727204703.21125: variable 'omit' from source: magic vars 49116 1727204703.21176: variable 'omit' from source: magic vars 49116 1727204703.21227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204703.21279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204703.21307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204703.21331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204703.21348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204703.21388: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204703.21397: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.21404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.21520: Set connection var ansible_connection to ssh 49116 1727204703.21540: Set connection var ansible_timeout to 10 49116 1727204703.21553: Set connection var ansible_shell_executable to /bin/sh 49116 1727204703.21567: Set connection var ansible_pipelining to False 49116 1727204703.21574: Set connection var ansible_shell_type to sh 49116 1727204703.21586: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204703.21623: variable 'ansible_shell_executable' from source: unknown 49116 1727204703.21708: variable 'ansible_connection' from source: unknown 49116 1727204703.21712: variable 'ansible_module_compression' from source: unknown 49116 1727204703.21714: variable 'ansible_shell_type' from source: unknown 49116 1727204703.21717: variable 'ansible_shell_executable' from source: unknown 49116 1727204703.21719: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.21721: variable 'ansible_pipelining' from source: unknown 49116 1727204703.21723: variable 'ansible_timeout' from source: unknown 49116 1727204703.21726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.21827: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204703.21845: variable 'omit' from source: magic vars 49116 1727204703.21854: starting attempt loop 49116 1727204703.21859: running the handler 49116 1727204703.21910: handler run complete 49116 1727204703.21937: attempt loop complete, 
returning result 49116 1727204703.21943: _execute() done 49116 1727204703.21950: dumping result to json 49116 1727204703.21955: done dumping result, returning 49116 1727204703.21967: done running TaskExecutor() for managed-node3/TASK: TEARDOWN: remove profiles. [127b8e07-fff9-02f7-957b-00000000005d] 49116 1727204703.21976: sending task result for task 127b8e07-fff9-02f7-957b-00000000005d 49116 1727204703.22231: done sending task result for task 127b8e07-fff9-02f7-957b-00000000005d 49116 1727204703.22234: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: ################################################## 49116 1727204703.22292: no more pending results, returning what we have 49116 1727204703.22295: results queue empty 49116 1727204703.22296: checking for any_errors_fatal 49116 1727204703.22303: done checking for any_errors_fatal 49116 1727204703.22304: checking for max_fail_percentage 49116 1727204703.22306: done checking for max_fail_percentage 49116 1727204703.22307: checking to see if all hosts have failed and the running result is not ok 49116 1727204703.22308: done checking to see if all hosts have failed 49116 1727204703.22309: getting the remaining hosts for this loop 49116 1727204703.22310: done getting the remaining hosts for this loop 49116 1727204703.22315: getting the next task for host managed-node3 49116 1727204703.22323: done getting next task for host managed-node3 49116 1727204703.22329: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 49116 1727204703.22332: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204703.22355: getting variables 49116 1727204703.22357: in VariableManager get_vars() 49116 1727204703.22412: Calling all_inventory to load vars for managed-node3 49116 1727204703.22417: Calling groups_inventory to load vars for managed-node3 49116 1727204703.22420: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204703.22434: Calling all_plugins_play to load vars for managed-node3 49116 1727204703.22437: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204703.22441: Calling groups_plugins_play to load vars for managed-node3 49116 1727204703.24381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204703.26560: done with get_vars() 49116 1727204703.26601: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.071) 0:00:26.292 ***** 49116 1727204703.26710: entering _queue_task() for managed-node3/include_tasks 49116 1727204703.27322: worker is 1 (out of 1 available) 49116 1727204703.27334: exiting _queue_task() for managed-node3/include_tasks 49116 1727204703.27348: done queuing things up, now waiting for results queue to drain 49116 1727204703.27350: waiting for pending results... 
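The role task queued above, fedora.linux_system_roles.network : Ensure ansible_facts used by role (roles/network/tasks/main.yml:4), is an include_tasks action; the entries below show it loading roles/network/tasks/set_facts.yml and extending the task list with its blocks. A minimal sketch of such an include, assuming the file is referenced by its relative name:

---
# Sketch of the include at roles/network/tasks/main.yml:4. The included file
# (set_facts.yml) is confirmed by the "processing included file" entries below;
# the relative path form is assumed.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
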
49116 1727204703.27918: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 49116 1727204703.28455: in run() - task 127b8e07-fff9-02f7-957b-000000000065 49116 1727204703.28460: variable 'ansible_search_path' from source: unknown 49116 1727204703.28463: variable 'ansible_search_path' from source: unknown 49116 1727204703.28671: calling self._execute() 49116 1727204703.28802: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.28848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.28869: variable 'omit' from source: magic vars 49116 1727204703.29370: variable 'ansible_distribution_major_version' from source: facts 49116 1727204703.29396: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204703.29411: _execute() done 49116 1727204703.29420: dumping result to json 49116 1727204703.29436: done dumping result, returning 49116 1727204703.29449: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-02f7-957b-000000000065] 49116 1727204703.29459: sending task result for task 127b8e07-fff9-02f7-957b-000000000065 49116 1727204703.29700: done sending task result for task 127b8e07-fff9-02f7-957b-000000000065 49116 1727204703.29705: WORKER PROCESS EXITING 49116 1727204703.29757: no more pending results, returning what we have 49116 1727204703.29763: in VariableManager get_vars() 49116 1727204703.29815: Calling all_inventory to load vars for managed-node3 49116 1727204703.29818: Calling groups_inventory to load vars for managed-node3 49116 1727204703.29820: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204703.29834: Calling all_plugins_play to load vars for managed-node3 49116 1727204703.29837: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204703.29840: Calling groups_plugins_play to load vars for managed-node3 49116 1727204703.32178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204703.34670: done with get_vars() 49116 1727204703.34706: variable 'ansible_search_path' from source: unknown 49116 1727204703.34707: variable 'ansible_search_path' from source: unknown 49116 1727204703.34753: we have included files to process 49116 1727204703.34754: generating all_blocks data 49116 1727204703.34756: done generating all_blocks data 49116 1727204703.34761: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49116 1727204703.34762: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49116 1727204703.34765: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49116 1727204703.35500: done processing included file 49116 1727204703.35503: iterating over new_blocks loaded from include file 49116 1727204703.35504: in VariableManager get_vars() 49116 1727204703.35534: done with get_vars() 49116 1727204703.35536: filtering new block on tags 49116 1727204703.35557: done filtering new block on tags 49116 1727204703.35560: in VariableManager get_vars() 49116 1727204703.35587: done with get_vars() 49116 1727204703.35590: filtering new block on tags 49116 1727204703.35612: done filtering new block on tags 49116 1727204703.35615: in 
VariableManager get_vars() 49116 1727204703.35642: done with get_vars() 49116 1727204703.35645: filtering new block on tags 49116 1727204703.35665: done filtering new block on tags 49116 1727204703.35675: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 49116 1727204703.35682: extending task lists for all hosts with included blocks 49116 1727204703.36725: done extending task lists 49116 1727204703.36727: done processing included files 49116 1727204703.36727: results queue empty 49116 1727204703.36728: checking for any_errors_fatal 49116 1727204703.36732: done checking for any_errors_fatal 49116 1727204703.36733: checking for max_fail_percentage 49116 1727204703.36734: done checking for max_fail_percentage 49116 1727204703.36735: checking to see if all hosts have failed and the running result is not ok 49116 1727204703.36736: done checking to see if all hosts have failed 49116 1727204703.36737: getting the remaining hosts for this loop 49116 1727204703.36738: done getting the remaining hosts for this loop 49116 1727204703.36741: getting the next task for host managed-node3 49116 1727204703.36745: done getting next task for host managed-node3 49116 1727204703.36749: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 49116 1727204703.36752: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204703.36764: getting variables 49116 1727204703.36767: in VariableManager get_vars() 49116 1727204703.36788: Calling all_inventory to load vars for managed-node3 49116 1727204703.36790: Calling groups_inventory to load vars for managed-node3 49116 1727204703.36792: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204703.36799: Calling all_plugins_play to load vars for managed-node3 49116 1727204703.36801: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204703.36804: Calling groups_plugins_play to load vars for managed-node3 49116 1727204703.40223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204703.43254: done with get_vars() 49116 1727204703.43503: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.168) 0:00:26.460 ***** 49116 1727204703.43600: entering _queue_task() for managed-node3/setup 49116 1727204703.44391: worker is 1 (out of 1 available) 49116 1727204703.44405: exiting _queue_task() for managed-node3/setup 49116 1727204703.44422: done queuing things up, now waiting for results queue to drain 49116 1727204703.44423: waiting for pending results... 49116 1727204703.44758: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 49116 1727204703.44959: in run() - task 127b8e07-fff9-02f7-957b-000000000883 49116 1727204703.44990: variable 'ansible_search_path' from source: unknown 49116 1727204703.45071: variable 'ansible_search_path' from source: unknown 49116 1727204703.45076: calling self._execute() 49116 1727204703.45170: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.45183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.45198: variable 'omit' from source: magic vars 49116 1727204703.45647: variable 'ansible_distribution_major_version' from source: facts 49116 1727204703.45670: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204703.46280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204703.49548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204703.49652: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204703.49711: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204703.49756: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204703.49792: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204703.49992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204703.49995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 49116 1727204703.49998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204703.50019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204703.50036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204703.50108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204703.50145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204703.50173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204703.50217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204703.50244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204703.50442: variable '__network_required_facts' from source: role '' defaults 49116 1727204703.50459: variable 'ansible_facts' from source: unknown 49116 1727204703.51516: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 49116 1727204703.51521: when evaluation is False, skipping this task 49116 1727204703.51524: _execute() done 49116 1727204703.51526: dumping result to json 49116 1727204703.51532: done dumping result, returning 49116 1727204703.51551: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-02f7-957b-000000000883] 49116 1727204703.51554: sending task result for task 127b8e07-fff9-02f7-957b-000000000883 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204703.51715: no more pending results, returning what we have 49116 1727204703.51720: results queue empty 49116 1727204703.51721: checking for any_errors_fatal 49116 1727204703.51723: done checking for any_errors_fatal 49116 1727204703.51724: checking for max_fail_percentage 49116 1727204703.51727: done checking for max_fail_percentage 49116 1727204703.51728: checking to see if all hosts have failed and the running result is not ok 49116 1727204703.51729: done checking to see if all hosts have failed 49116 1727204703.51730: getting the remaining hosts for this loop 49116 1727204703.51732: done getting the remaining hosts for this loop 49116 1727204703.51743: getting the next task for host managed-node3 49116 1727204703.51755: done getting next task for host 
managed-node3 49116 1727204703.51759: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 49116 1727204703.51764: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204703.51793: getting variables 49116 1727204703.51795: in VariableManager get_vars() 49116 1727204703.51962: Calling all_inventory to load vars for managed-node3 49116 1727204703.51967: Calling groups_inventory to load vars for managed-node3 49116 1727204703.51971: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204703.51978: done sending task result for task 127b8e07-fff9-02f7-957b-000000000883 49116 1727204703.51982: WORKER PROCESS EXITING 49116 1727204703.52078: Calling all_plugins_play to load vars for managed-node3 49116 1727204703.52082: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204703.52086: Calling groups_plugins_play to load vars for managed-node3 49116 1727204703.54078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204703.56437: done with get_vars() 49116 1727204703.56481: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.129) 0:00:26.591 ***** 49116 1727204703.56608: entering _queue_task() for managed-node3/stat 49116 1727204703.57064: worker is 1 (out of 1 available) 49116 1727204703.57080: exiting _queue_task() for managed-node3/stat 49116 1727204703.57095: done queuing things up, now waiting for results queue to drain 49116 1727204703.57096: waiting for pending results... 
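Two of the set_facts.yml tasks appear in this stretch: "Ensure ansible_facts used by role are present" (set_facts.yml:3, a setup action skipped because __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False) and "Check if system is ostree" (set_facts.yml:12, a stat action queued above and skipped below because __network_is_ostree is already defined). A sketch of both follows; the modules and when: conditions are taken from the log, while gather_subset, the stat path and the register name are assumptions.

---
# Sketch of set_facts.yml lines 3 and 12. Modules and guards match the log;
# gather_subset, the stat path and the register name are illustrative.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined
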
49116 1727204703.57409: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 49116 1727204703.57590: in run() - task 127b8e07-fff9-02f7-957b-000000000885 49116 1727204703.57772: variable 'ansible_search_path' from source: unknown 49116 1727204703.57776: variable 'ansible_search_path' from source: unknown 49116 1727204703.57780: calling self._execute() 49116 1727204703.57782: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.57785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.57794: variable 'omit' from source: magic vars 49116 1727204703.58259: variable 'ansible_distribution_major_version' from source: facts 49116 1727204703.58269: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204703.58489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204703.58824: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204703.58882: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204703.58926: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204703.58964: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204703.59071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204703.59103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204703.59140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204703.59169: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204703.59289: variable '__network_is_ostree' from source: set_fact 49116 1727204703.59297: Evaluated conditional (not __network_is_ostree is defined): False 49116 1727204703.59301: when evaluation is False, skipping this task 49116 1727204703.59304: _execute() done 49116 1727204703.59313: dumping result to json 49116 1727204703.59316: done dumping result, returning 49116 1727204703.59325: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-02f7-957b-000000000885] 49116 1727204703.59341: sending task result for task 127b8e07-fff9-02f7-957b-000000000885 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 49116 1727204703.59824: no more pending results, returning what we have 49116 1727204703.59828: results queue empty 49116 1727204703.59829: checking for any_errors_fatal 49116 1727204703.59838: done checking for any_errors_fatal 49116 1727204703.59840: checking for max_fail_percentage 49116 1727204703.59842: done checking for max_fail_percentage 49116 1727204703.59843: checking to see if all hosts have 
failed and the running result is not ok 49116 1727204703.59844: done checking to see if all hosts have failed 49116 1727204703.59844: getting the remaining hosts for this loop 49116 1727204703.59846: done getting the remaining hosts for this loop 49116 1727204703.59857: getting the next task for host managed-node3 49116 1727204703.59864: done getting next task for host managed-node3 49116 1727204703.59872: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 49116 1727204703.59875: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204703.59885: done sending task result for task 127b8e07-fff9-02f7-957b-000000000885 49116 1727204703.59888: WORKER PROCESS EXITING 49116 1727204703.59899: getting variables 49116 1727204703.59901: in VariableManager get_vars() 49116 1727204703.59941: Calling all_inventory to load vars for managed-node3 49116 1727204703.59944: Calling groups_inventory to load vars for managed-node3 49116 1727204703.59946: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204703.59960: Calling all_plugins_play to load vars for managed-node3 49116 1727204703.59963: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204703.59966: Calling groups_plugins_play to load vars for managed-node3 49116 1727204703.61962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204703.64364: done with get_vars() 49116 1727204703.64410: done getting variables 49116 1727204703.64489: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.079) 0:00:26.670 ***** 49116 1727204703.64531: entering _queue_task() for managed-node3/set_fact 49116 1727204703.64954: worker is 1 (out of 1 available) 49116 1727204703.65073: exiting _queue_task() for managed-node3/set_fact 49116 1727204703.65087: done queuing things up, now waiting for results queue to drain 49116 1727204703.65089: waiting for pending results... 
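The last task queued in this excerpt, "Set flag to indicate system is ostree" (set_facts.yml:17), is a set_fact action with the same not __network_is_ostree is defined guard, so it is likewise skipped below. A sketch, assuming the flag is derived from the stat result registered by the previous task (reusing the hypothetical register name __ostree_booted_stat from the sketch above):

---
# Sketch of set_facts.yml line 17; the module (set_fact) and the guard are
# confirmed by the log, the value expression reuses the hypothetical register
# name introduced in the previous sketch.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
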
49116 1727204703.65383: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 49116 1727204703.65605: in run() - task 127b8e07-fff9-02f7-957b-000000000886 49116 1727204703.65637: variable 'ansible_search_path' from source: unknown 49116 1727204703.65659: variable 'ansible_search_path' from source: unknown 49116 1727204703.65771: calling self._execute() 49116 1727204703.65849: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.65877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.65913: variable 'omit' from source: magic vars 49116 1727204703.66410: variable 'ansible_distribution_major_version' from source: facts 49116 1727204703.66448: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204703.66673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204703.67005: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204703.67074: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204703.67183: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204703.67187: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204703.67262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204703.67306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204703.67343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204703.67377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204703.67532: variable '__network_is_ostree' from source: set_fact 49116 1727204703.67618: Evaluated conditional (not __network_is_ostree is defined): False 49116 1727204703.67621: when evaluation is False, skipping this task 49116 1727204703.67623: _execute() done 49116 1727204703.67627: dumping result to json 49116 1727204703.67629: done dumping result, returning 49116 1727204703.67635: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-02f7-957b-000000000886] 49116 1727204703.67637: sending task result for task 127b8e07-fff9-02f7-957b-000000000886 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 49116 1727204703.67920: no more pending results, returning what we have 49116 1727204703.67925: results queue empty 49116 1727204703.67926: checking for any_errors_fatal 49116 1727204703.67939: done checking for any_errors_fatal 49116 1727204703.67940: checking for max_fail_percentage 49116 1727204703.67942: done checking for max_fail_percentage 49116 1727204703.67943: checking to see 
if all hosts have failed and the running result is not ok 49116 1727204703.67944: done checking to see if all hosts have failed 49116 1727204703.67945: getting the remaining hosts for this loop 49116 1727204703.67947: done getting the remaining hosts for this loop 49116 1727204703.67953: getting the next task for host managed-node3 49116 1727204703.67967: done getting next task for host managed-node3 49116 1727204703.67971: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 49116 1727204703.67976: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204703.68002: getting variables 49116 1727204703.68004: in VariableManager get_vars() 49116 1727204703.68057: Calling all_inventory to load vars for managed-node3 49116 1727204703.68060: Calling groups_inventory to load vars for managed-node3 49116 1727204703.68062: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204703.68279: Calling all_plugins_play to load vars for managed-node3 49116 1727204703.68283: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204703.68288: Calling groups_plugins_play to load vars for managed-node3 49116 1727204703.68828: done sending task result for task 127b8e07-fff9-02f7-957b-000000000886 49116 1727204703.68835: WORKER PROCESS EXITING 49116 1727204703.71883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204703.76603: done with get_vars() 49116 1727204703.76652: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:05:03 -0400 (0:00:00.122) 0:00:26.792 ***** 49116 1727204703.76970: entering _queue_task() for managed-node3/service_facts 49116 1727204703.77624: worker is 1 (out of 1 available) 49116 1727204703.77642: exiting _queue_task() for managed-node3/service_facts 49116 1727204703.77657: done queuing things up, now waiting for results queue to drain 49116 1727204703.77658: waiting for pending results... 
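The task queued here runs the service_facts module, which reports each systemd unit as an entry under ansible_facts["services"], keyed by unit name with name, state, status, and source fields; the full payload appears in the module stdout further down in this log. A small sketch of how such a result can be consumed, using two entries copied from this run's output (editor's illustration only, not code from the role):

# Editor's sketch: filter a service_facts-style dict for running units.
services = {
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped",
                                 "status": "disabled", "source": "systemd"},
}

def running_units(services):
    """Return the names of units whose reported state is 'running'."""
    return [name for name, svc in services.items() if svc.get("state") == "running"]

print(running_units(services))  # ['NetworkManager.service']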
49116 1727204703.78275: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 49116 1727204703.78653: in run() - task 127b8e07-fff9-02f7-957b-000000000888 49116 1727204703.78669: variable 'ansible_search_path' from source: unknown 49116 1727204703.78675: variable 'ansible_search_path' from source: unknown 49116 1727204703.78713: calling self._execute() 49116 1727204703.78920: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.78927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.78939: variable 'omit' from source: magic vars 49116 1727204703.79845: variable 'ansible_distribution_major_version' from source: facts 49116 1727204703.79857: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204703.79865: variable 'omit' from source: magic vars 49116 1727204703.79956: variable 'omit' from source: magic vars 49116 1727204703.80236: variable 'omit' from source: magic vars 49116 1727204703.80250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204703.80293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204703.80342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204703.80346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204703.80348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204703.80482: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204703.80486: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.80489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.80600: Set connection var ansible_connection to ssh 49116 1727204703.80669: Set connection var ansible_timeout to 10 49116 1727204703.80672: Set connection var ansible_shell_executable to /bin/sh 49116 1727204703.80676: Set connection var ansible_pipelining to False 49116 1727204703.80678: Set connection var ansible_shell_type to sh 49116 1727204703.80680: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204703.80682: variable 'ansible_shell_executable' from source: unknown 49116 1727204703.80685: variable 'ansible_connection' from source: unknown 49116 1727204703.80972: variable 'ansible_module_compression' from source: unknown 49116 1727204703.80977: variable 'ansible_shell_type' from source: unknown 49116 1727204703.80981: variable 'ansible_shell_executable' from source: unknown 49116 1727204703.80983: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204703.80991: variable 'ansible_pipelining' from source: unknown 49116 1727204703.80993: variable 'ansible_timeout' from source: unknown 49116 1727204703.80995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204703.81427: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204703.81432: variable 'omit' from source: magic vars 49116 
1727204703.81438: starting attempt loop 49116 1727204703.81440: running the handler 49116 1727204703.81471: _low_level_execute_command(): starting 49116 1727204703.81474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204703.82994: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204703.83090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204703.83172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204703.83176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 49116 1727204703.83179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204703.83182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204703.83185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204703.83302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204703.83311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204703.83587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204703.83686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204703.85516: stdout chunk (state=3): >>>/root <<< 49116 1727204703.85621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204703.85784: stderr chunk (state=3): >>><<< 49116 1727204703.85788: stdout chunk (state=3): >>><<< 49116 1727204703.85815: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204703.85832: 
_low_level_execute_command(): starting 49116 1727204703.85841: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241 `" && echo ansible-tmp-1727204703.8581557-50748-158348723870241="` echo /root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241 `" ) && sleep 0' 49116 1727204703.87412: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204703.87474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204703.87477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204703.87860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204703.87975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204703.90179: stdout chunk (state=3): >>>ansible-tmp-1727204703.8581557-50748-158348723870241=/root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241 <<< 49116 1727204703.90342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204703.90481: stderr chunk (state=3): >>><<< 49116 1727204703.90484: stdout chunk (state=3): >>><<< 49116 1727204703.90503: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204703.8581557-50748-158348723870241=/root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 49116 1727204703.90558: variable 'ansible_module_compression' from source: unknown 49116 1727204703.90609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 49116 1727204703.90652: variable 'ansible_facts' from source: unknown 49116 1727204703.90946: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/AnsiballZ_service_facts.py 49116 1727204703.91290: Sending initial data 49116 1727204703.91294: Sent initial data (162 bytes) 49116 1727204703.92428: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204703.92472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204703.92476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204703.92858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204703.92863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204703.92917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204703.94710: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 49116 1727204703.94715: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204703.94909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204703.94983: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpnc9fbovr /root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/AnsiballZ_service_facts.py <<< 49116 1727204703.94987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/AnsiballZ_service_facts.py" <<< 49116 1727204703.95101: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpnc9fbovr" to remote "/root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/AnsiballZ_service_facts.py" <<< 49116 1727204703.96342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204703.96472: stderr chunk (state=3): >>><<< 49116 1727204703.96477: stdout chunk (state=3): >>><<< 49116 1727204703.96480: done transferring module to remote 49116 1727204703.96483: _low_level_execute_command(): starting 49116 1727204703.96486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/ /root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/AnsiballZ_service_facts.py && sleep 0' 49116 1727204703.97096: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204703.97104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204703.97115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204703.97139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204703.97148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204703.97156: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204703.97167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204703.97182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204703.97189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204703.97257: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49116 1727204703.97260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204703.97263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204703.97299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204703.97346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204703.97420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204703.99557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204703.99564: stdout chunk (state=3): 
>>><<< 49116 1727204703.99569: stderr chunk (state=3): >>><<< 49116 1727204703.99673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204703.99676: _low_level_execute_command(): starting 49116 1727204703.99681: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/AnsiballZ_service_facts.py && sleep 0' 49116 1727204704.00318: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204704.00346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204704.00454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204704.00484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204704.00502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204704.00527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204704.00663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204706.44990: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": 
{"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind<<< 49116 1727204706.45080: stdout chunk (state=3): >>>.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "ply<<< 49116 1727204706.45104: stdout chunk (state=3): >>>mouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 49116 1727204706.46943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204706.46959: stderr chunk (state=3): >>>Shared connection to 10.31.45.169 closed. <<< 49116 1727204706.46985: stdout chunk (state=3): >>><<< 49116 1727204706.46988: stderr chunk (state=3): >>><<< 49116 1727204706.47174: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": 
"systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204706.48004: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204706.48024: _low_level_execute_command(): starting 49116 1727204706.48034: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204703.8581557-50748-158348723870241/ > /dev/null 2>&1 && sleep 0' 49116 1727204706.48723: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204706.48785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204706.48853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204706.48878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204706.48903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204706.49011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204706.51261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204706.51277: stdout chunk (state=3): >>><<< 49116 1727204706.51290: stderr chunk (state=3): >>><<< 49116 1727204706.51311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204706.51472: handler run complete 49116 1727204706.51574: variable 'ansible_facts' from source: unknown 49116 1727204706.51838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204706.52258: variable 'ansible_facts' from source: unknown 49116 1727204706.52361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204706.52523: attempt loop complete, returning result 49116 1727204706.52527: _execute() done 49116 1727204706.52532: dumping result to json 49116 1727204706.52576: done dumping result, returning 49116 1727204706.52593: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-02f7-957b-000000000888] 49116 1727204706.52596: sending task result for task 127b8e07-fff9-02f7-957b-000000000888 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204706.53341: done sending task result for task 127b8e07-fff9-02f7-957b-000000000888 49116 1727204706.53345: WORKER PROCESS EXITING 49116 1727204706.53354: no more pending results, returning what we have 49116 1727204706.53357: results queue empty 49116 1727204706.53357: checking for any_errors_fatal 49116 1727204706.53361: done checking for any_errors_fatal 49116 1727204706.53361: checking for max_fail_percentage 49116 1727204706.53362: done checking for max_fail_percentage 49116 1727204706.53363: checking to see if all hosts have failed and the running result is not ok 49116 1727204706.53364: done checking to see if all hosts have failed 49116 1727204706.53364: getting the remaining hosts for this loop 49116 1727204706.53367: done getting the remaining hosts for this loop 49116 1727204706.53370: getting the next task for host managed-node3 49116 1727204706.53374: done getting next task for host managed-node3 49116 1727204706.53377: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 49116 1727204706.53379: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204706.53392: getting variables 49116 1727204706.53393: in VariableManager get_vars() 49116 1727204706.53420: Calling all_inventory to load vars for managed-node3 49116 1727204706.53422: Calling groups_inventory to load vars for managed-node3 49116 1727204706.53423: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204706.53431: Calling all_plugins_play to load vars for managed-node3 49116 1727204706.53435: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204706.53437: Calling groups_plugins_play to load vars for managed-node3 49116 1727204706.55022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204706.56261: done with get_vars() 49116 1727204706.56296: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:05:06 -0400 (0:00:02.796) 0:00:29.588 ***** 49116 1727204706.56385: entering _queue_task() for managed-node3/package_facts 49116 1727204706.56685: worker is 1 (out of 1 available) 49116 1727204706.56701: exiting _queue_task() for managed-node3/package_facts 49116 1727204706.56716: done queuing things up, now waiting for results queue to drain 49116 1727204706.56717: waiting for pending results... 49116 1727204706.56921: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 49116 1727204706.57040: in run() - task 127b8e07-fff9-02f7-957b-000000000889 49116 1727204706.57053: variable 'ansible_search_path' from source: unknown 49116 1727204706.57057: variable 'ansible_search_path' from source: unknown 49116 1727204706.57091: calling self._execute() 49116 1727204706.57228: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204706.57232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204706.57238: variable 'omit' from source: magic vars 49116 1727204706.57671: variable 'ansible_distribution_major_version' from source: facts 49116 1727204706.57675: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204706.57677: variable 'omit' from source: magic vars 49116 1727204706.57721: variable 'omit' from source: magic vars 49116 1727204706.57764: variable 'omit' from source: magic vars 49116 1727204706.57817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204706.57864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204706.57895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204706.57919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204706.57938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204706.57977: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204706.57986: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204706.57994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204706.58151: Set connection var ansible_connection to ssh 49116 
1727204706.58154: Set connection var ansible_timeout to 10 49116 1727204706.58157: Set connection var ansible_shell_executable to /bin/sh 49116 1727204706.58168: Set connection var ansible_pipelining to False 49116 1727204706.58179: Set connection var ansible_shell_type to sh 49116 1727204706.58194: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204706.58278: variable 'ansible_shell_executable' from source: unknown 49116 1727204706.58281: variable 'ansible_connection' from source: unknown 49116 1727204706.58284: variable 'ansible_module_compression' from source: unknown 49116 1727204706.58286: variable 'ansible_shell_type' from source: unknown 49116 1727204706.58290: variable 'ansible_shell_executable' from source: unknown 49116 1727204706.58292: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204706.58294: variable 'ansible_pipelining' from source: unknown 49116 1727204706.58296: variable 'ansible_timeout' from source: unknown 49116 1727204706.58416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204706.58639: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204706.58659: variable 'omit' from source: magic vars 49116 1727204706.58673: starting attempt loop 49116 1727204706.58680: running the handler 49116 1727204706.58699: _low_level_execute_command(): starting 49116 1727204706.58711: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204706.59874: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204706.59925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204706.59958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204706.59984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204706.60078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204706.61964: stdout chunk (state=3): >>>/root <<< 49116 1727204706.62107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204706.62330: stderr chunk (state=3): >>><<< 49116 1727204706.62337: stdout chunk (state=3): >>><<< 49116 1727204706.62425: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204706.62430: _low_level_execute_command(): starting 49116 1727204706.62436: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894 `" && echo ansible-tmp-1727204706.623737-50830-153745061423894="` echo /root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894 `" ) && sleep 0' 49116 1727204706.63393: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204706.63487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204706.63505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204706.63545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204706.63654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204706.65854: stdout chunk (state=3): >>>ansible-tmp-1727204706.623737-50830-153745061423894=/root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894 <<< 49116 1727204706.66054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204706.66108: stderr chunk (state=3): >>><<< 49116 1727204706.66112: stdout chunk (state=3): >>><<< 49116 1727204706.66114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204706.623737-50830-153745061423894=/root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204706.66232: variable 'ansible_module_compression' from source: unknown 49116 1727204706.66236: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 49116 1727204706.66471: variable 'ansible_facts' from source: unknown 49116 1727204706.66478: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/AnsiballZ_package_facts.py 49116 1727204706.66643: Sending initial data 49116 1727204706.66646: Sent initial data (161 bytes) 49116 1727204706.67307: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204706.67325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204706.67336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204706.67381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204706.67396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204706.67482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204706.69291: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204706.69356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204706.69429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmptj2xjb95 /root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/AnsiballZ_package_facts.py <<< 49116 1727204706.69432: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/AnsiballZ_package_facts.py" <<< 49116 1727204706.69491: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmptj2xjb95" to remote "/root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/AnsiballZ_package_facts.py" <<< 49116 1727204706.69499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/AnsiballZ_package_facts.py" <<< 49116 1727204706.70721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204706.70797: stderr chunk (state=3): >>><<< 49116 1727204706.70801: stdout chunk (state=3): >>><<< 49116 1727204706.70826: done transferring module to remote 49116 1727204706.70840: _low_level_execute_command(): starting 49116 1727204706.70845: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/ /root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/AnsiballZ_package_facts.py && sleep 0' 49116 1727204706.71341: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204706.71346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204706.71348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204706.71351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204706.71402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204706.71406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204706.71484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 
1727204706.73519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204706.73579: stderr chunk (state=3): >>><<< 49116 1727204706.73583: stdout chunk (state=3): >>><<< 49116 1727204706.73596: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204706.73599: _low_level_execute_command(): starting 49116 1727204706.73605: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/AnsiballZ_package_facts.py && sleep 0' 49116 1727204706.74104: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204706.74108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204706.74110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204706.74113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204706.74163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204706.74173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204706.74176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204706.74249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204707.39375: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": 
"linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": 
"20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 49116 1727204707.39398: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": 
"audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40",<<< 49116 1727204707.39407: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 49116 1727204707.39434: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "li<<< 49116 1727204707.39445: stdout chunk (state=3): >>>breport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-l<<< 49116 1727204707.39451: stdout chunk (state=3): >>>ibs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source"<<< 49116 1727204707.39494: stdout chunk (state=3): >>>: "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1"<<< 49116 1727204707.39540: stdout chunk (state=3): >>>, "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": 
[{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": <<< 49116 1727204707.39546: stdout chunk (state=3): >>>"x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "<<< 49116 1727204707.39552: stdout chunk (state=3): >>>rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}]<<< 49116 1727204707.39592: stdout chunk (state=3): >>>, "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 49116 1727204707.39613: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": 
"libnl3-cli", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.32", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.32", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 49116 1727204707.41694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204707.41760: stderr chunk (state=3): >>><<< 49116 1727204707.41763: stdout chunk (state=3): >>><<< 49116 1727204707.41808: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", 
"version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": 
"22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", 
"version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": 
"ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": 
"1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": 
[{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": 
"2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": 
"1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": 
"1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", 
"version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.32", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.32", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204707.43572: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204707.43593: _low_level_execute_command(): starting 49116 1727204707.43597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204706.623737-50830-153745061423894/ > /dev/null 2>&1 && sleep 0' 49116 1727204707.44102: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204707.44106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204707.44109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204707.44112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204707.44171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204707.44175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204707.44181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204707.44255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204707.46345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204707.46411: stderr chunk (state=3): >>><<< 49116 1727204707.46416: stdout chunk (state=3): >>><<< 49116 1727204707.46428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204707.46438: handler run complete 49116 1727204707.47073: variable 'ansible_facts' from source: unknown 49116 1727204707.47438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204707.49020: variable 'ansible_facts' from source: unknown 49116 1727204707.49356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204707.49921: attempt loop complete, returning result 49116 1727204707.49931: _execute() done 49116 1727204707.49937: dumping result to json 49116 1727204707.50098: done dumping result, returning 49116 1727204707.50107: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-02f7-957b-000000000889] 49116 1727204707.50113: sending task result for task 127b8e07-fff9-02f7-957b-000000000889 49116 1727204707.52016: done sending task result for task 127b8e07-fff9-02f7-957b-000000000889 49116 1727204707.52021: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204707.52117: no more pending results, returning what we have 49116 1727204707.52120: results queue empty 49116 1727204707.52121: checking for any_errors_fatal 49116 1727204707.52126: done checking for any_errors_fatal 49116 1727204707.52127: checking for max_fail_percentage 49116 1727204707.52128: done checking for max_fail_percentage 49116 1727204707.52129: checking to see if all hosts have failed and the running result is not ok 49116 1727204707.52129: done checking to see if all hosts have failed 49116 1727204707.52130: getting the remaining hosts for this loop 49116 1727204707.52131: done getting the remaining hosts for this loop 49116 1727204707.52134: getting the next task for host managed-node3 49116 1727204707.52140: done getting next task for host managed-node3 49116 1727204707.52142: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 49116 1727204707.52144: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204707.52152: getting variables 49116 1727204707.52153: in VariableManager get_vars() 49116 1727204707.52183: Calling all_inventory to load vars for managed-node3 49116 1727204707.52185: Calling groups_inventory to load vars for managed-node3 49116 1727204707.52187: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204707.52195: Calling all_plugins_play to load vars for managed-node3 49116 1727204707.52196: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204707.52198: Calling groups_plugins_play to load vars for managed-node3 49116 1727204707.53122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204707.54323: done with get_vars() 49116 1727204707.54356: done getting variables 49116 1727204707.54414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.980) 0:00:30.569 ***** 49116 1727204707.54450: entering _queue_task() for managed-node3/debug 49116 1727204707.54738: worker is 1 (out of 1 available) 49116 1727204707.54753: exiting _queue_task() for managed-node3/debug 49116 1727204707.54770: done queuing things up, now waiting for results queue to drain 49116 1727204707.54771: waiting for pending results... 49116 1727204707.54985: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 49116 1727204707.55086: in run() - task 127b8e07-fff9-02f7-957b-000000000066 49116 1727204707.55101: variable 'ansible_search_path' from source: unknown 49116 1727204707.55105: variable 'ansible_search_path' from source: unknown 49116 1727204707.55137: calling self._execute() 49116 1727204707.55228: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204707.55232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204707.55240: variable 'omit' from source: magic vars 49116 1727204707.55556: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.55564: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204707.55572: variable 'omit' from source: magic vars 49116 1727204707.55618: variable 'omit' from source: magic vars 49116 1727204707.55703: variable 'network_provider' from source: set_fact 49116 1727204707.55718: variable 'omit' from source: magic vars 49116 1727204707.55760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204707.55796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204707.55812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204707.55828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204707.55841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 
1727204707.55866: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204707.55870: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204707.55872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204707.55947: Set connection var ansible_connection to ssh 49116 1727204707.55957: Set connection var ansible_timeout to 10 49116 1727204707.55966: Set connection var ansible_shell_executable to /bin/sh 49116 1727204707.55972: Set connection var ansible_pipelining to False 49116 1727204707.55975: Set connection var ansible_shell_type to sh 49116 1727204707.55980: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204707.56002: variable 'ansible_shell_executable' from source: unknown 49116 1727204707.56006: variable 'ansible_connection' from source: unknown 49116 1727204707.56009: variable 'ansible_module_compression' from source: unknown 49116 1727204707.56011: variable 'ansible_shell_type' from source: unknown 49116 1727204707.56014: variable 'ansible_shell_executable' from source: unknown 49116 1727204707.56017: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204707.56020: variable 'ansible_pipelining' from source: unknown 49116 1727204707.56023: variable 'ansible_timeout' from source: unknown 49116 1727204707.56028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204707.56151: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204707.56160: variable 'omit' from source: magic vars 49116 1727204707.56168: starting attempt loop 49116 1727204707.56172: running the handler 49116 1727204707.56215: handler run complete 49116 1727204707.56224: attempt loop complete, returning result 49116 1727204707.56227: _execute() done 49116 1727204707.56230: dumping result to json 49116 1727204707.56233: done dumping result, returning 49116 1727204707.56242: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-02f7-957b-000000000066] 49116 1727204707.56247: sending task result for task 127b8e07-fff9-02f7-957b-000000000066 49116 1727204707.56336: done sending task result for task 127b8e07-fff9-02f7-957b-000000000066 49116 1727204707.56339: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: Using network provider: nm 49116 1727204707.56414: no more pending results, returning what we have 49116 1727204707.56418: results queue empty 49116 1727204707.56419: checking for any_errors_fatal 49116 1727204707.56429: done checking for any_errors_fatal 49116 1727204707.56429: checking for max_fail_percentage 49116 1727204707.56431: done checking for max_fail_percentage 49116 1727204707.56432: checking to see if all hosts have failed and the running result is not ok 49116 1727204707.56433: done checking to see if all hosts have failed 49116 1727204707.56434: getting the remaining hosts for this loop 49116 1727204707.56435: done getting the remaining hosts for this loop 49116 1727204707.56440: getting the next task for host managed-node3 49116 1727204707.56446: done getting next task for host managed-node3 49116 1727204707.56452: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 49116 1727204707.56455: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204707.56475: getting variables 49116 1727204707.56477: in VariableManager get_vars() 49116 1727204707.56516: Calling all_inventory to load vars for managed-node3 49116 1727204707.56519: Calling groups_inventory to load vars for managed-node3 49116 1727204707.56521: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204707.56530: Calling all_plugins_play to load vars for managed-node3 49116 1727204707.56533: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204707.56536: Calling groups_plugins_play to load vars for managed-node3 49116 1727204707.62076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204707.63496: done with get_vars() 49116 1727204707.63539: done getting variables 49116 1727204707.63600: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.091) 0:00:30.661 ***** 49116 1727204707.63638: entering _queue_task() for managed-node3/fail 49116 1727204707.64044: worker is 1 (out of 1 available) 49116 1727204707.64059: exiting _queue_task() for managed-node3/fail 49116 1727204707.64076: done queuing things up, now waiting for results queue to drain 49116 1727204707.64078: waiting for pending results... 
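
The "Print network provider" task at roles/network/tasks/main.yml:7, which just reported "Using network provider: nm", is a debug task driven by the network_provider fact set earlier in the role. A sketch of what it most likely looks like (not the role's verbatim source):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
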
49116 1727204707.64493: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 49116 1727204707.64605: in run() - task 127b8e07-fff9-02f7-957b-000000000067 49116 1727204707.64628: variable 'ansible_search_path' from source: unknown 49116 1727204707.64637: variable 'ansible_search_path' from source: unknown 49116 1727204707.64685: calling self._execute() 49116 1727204707.64807: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204707.64822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204707.64839: variable 'omit' from source: magic vars 49116 1727204707.65294: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.65318: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204707.65476: variable 'network_state' from source: role '' defaults 49116 1727204707.65496: Evaluated conditional (network_state != {}): False 49116 1727204707.65506: when evaluation is False, skipping this task 49116 1727204707.65569: _execute() done 49116 1727204707.65576: dumping result to json 49116 1727204707.65580: done dumping result, returning 49116 1727204707.65584: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-02f7-957b-000000000067] 49116 1727204707.65587: sending task result for task 127b8e07-fff9-02f7-957b-000000000067 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204707.65855: no more pending results, returning what we have 49116 1727204707.65860: results queue empty 49116 1727204707.65861: checking for any_errors_fatal 49116 1727204707.65873: done checking for any_errors_fatal 49116 1727204707.65874: checking for max_fail_percentage 49116 1727204707.65877: done checking for max_fail_percentage 49116 1727204707.65878: checking to see if all hosts have failed and the running result is not ok 49116 1727204707.65879: done checking to see if all hosts have failed 49116 1727204707.65880: getting the remaining hosts for this loop 49116 1727204707.65882: done getting the remaining hosts for this loop 49116 1727204707.65887: getting the next task for host managed-node3 49116 1727204707.65896: done getting next task for host managed-node3 49116 1727204707.65901: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 49116 1727204707.65905: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204707.65931: getting variables 49116 1727204707.65933: in VariableManager get_vars() 49116 1727204707.66186: Calling all_inventory to load vars for managed-node3 49116 1727204707.66190: Calling groups_inventory to load vars for managed-node3 49116 1727204707.66193: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204707.66206: Calling all_plugins_play to load vars for managed-node3 49116 1727204707.66209: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204707.66213: Calling groups_plugins_play to load vars for managed-node3 49116 1727204707.66886: done sending task result for task 127b8e07-fff9-02f7-957b-000000000067 49116 1727204707.66891: WORKER PROCESS EXITING 49116 1727204707.68012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204707.69305: done with get_vars() 49116 1727204707.69336: done getting variables 49116 1727204707.69390: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.057) 0:00:30.719 ***** 49116 1727204707.69418: entering _queue_task() for managed-node3/fail 49116 1727204707.69709: worker is 1 (out of 1 available) 49116 1727204707.69725: exiting _queue_task() for managed-node3/fail 49116 1727204707.69738: done queuing things up, now waiting for results queue to drain 49116 1727204707.69739: waiting for pending results... 
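Note: both of the network_state abort checks skip in this run because network_state comes from the role defaults and is an empty dict ("Evaluated conditional (network_state != {}): False"). For contrast, a hypothetical invocation that would make that condition True, and therefore exercise these guards, could look like the sketch below; the device name and the nmstate-style keys are purely illustrative and are not taken from this run.

- hosts: managed-node3
  vars:
    network_state:
      interfaces:
        - name: eth0        # hypothetical device name
          type: ethernet
          state: up
  roles:
    - fedora.linux_system_roles.network
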
49116 1727204707.69964: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 49116 1727204707.70071: in run() - task 127b8e07-fff9-02f7-957b-000000000068 49116 1727204707.70084: variable 'ansible_search_path' from source: unknown 49116 1727204707.70088: variable 'ansible_search_path' from source: unknown 49116 1727204707.70126: calling self._execute() 49116 1727204707.70221: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204707.70226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204707.70234: variable 'omit' from source: magic vars 49116 1727204707.70562: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.70575: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204707.70673: variable 'network_state' from source: role '' defaults 49116 1727204707.70683: Evaluated conditional (network_state != {}): False 49116 1727204707.70687: when evaluation is False, skipping this task 49116 1727204707.70690: _execute() done 49116 1727204707.70693: dumping result to json 49116 1727204707.70695: done dumping result, returning 49116 1727204707.70704: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-02f7-957b-000000000068] 49116 1727204707.70707: sending task result for task 127b8e07-fff9-02f7-957b-000000000068 49116 1727204707.70807: done sending task result for task 127b8e07-fff9-02f7-957b-000000000068 49116 1727204707.70810: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204707.70890: no more pending results, returning what we have 49116 1727204707.70894: results queue empty 49116 1727204707.70896: checking for any_errors_fatal 49116 1727204707.70905: done checking for any_errors_fatal 49116 1727204707.70906: checking for max_fail_percentage 49116 1727204707.70907: done checking for max_fail_percentage 49116 1727204707.70908: checking to see if all hosts have failed and the running result is not ok 49116 1727204707.70909: done checking to see if all hosts have failed 49116 1727204707.70910: getting the remaining hosts for this loop 49116 1727204707.70911: done getting the remaining hosts for this loop 49116 1727204707.70915: getting the next task for host managed-node3 49116 1727204707.70924: done getting next task for host managed-node3 49116 1727204707.70928: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 49116 1727204707.70931: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204707.70950: getting variables 49116 1727204707.70951: in VariableManager get_vars() 49116 1727204707.70992: Calling all_inventory to load vars for managed-node3 49116 1727204707.70994: Calling groups_inventory to load vars for managed-node3 49116 1727204707.70996: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204707.71006: Calling all_plugins_play to load vars for managed-node3 49116 1727204707.71009: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204707.71011: Calling groups_plugins_play to load vars for managed-node3 49116 1727204707.72169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204707.73386: done with get_vars() 49116 1727204707.73413: done getting variables 49116 1727204707.73468: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.040) 0:00:30.759 ***** 49116 1727204707.73496: entering _queue_task() for managed-node3/fail 49116 1727204707.73788: worker is 1 (out of 1 available) 49116 1727204707.73806: exiting _queue_task() for managed-node3/fail 49116 1727204707.73820: done queuing things up, now waiting for results queue to drain 49116 1727204707.73822: waiting for pending results... 
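Note: the teaming guard queued above (main.yml:25) is also a 'fail' action. The run evaluates three expressions for it and skips on the distro membership check. A rough sketch using only what this log shows; the message is assumed, and the check that team connections are actually requested is implied by the task name but never reached here.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9    # evaluated True in this run
    - ansible_distribution in __network_rh_distros    # evaluated False here, so the task skips
    # a condition verifying that team connections are requested (e.g. via
    # __network_team_connections_defined) is implied by the task name but is
    # not evaluated in this run
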
49116 1727204707.74037: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 49116 1727204707.74148: in run() - task 127b8e07-fff9-02f7-957b-000000000069 49116 1727204707.74166: variable 'ansible_search_path' from source: unknown 49116 1727204707.74170: variable 'ansible_search_path' from source: unknown 49116 1727204707.74205: calling self._execute() 49116 1727204707.74294: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204707.74299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204707.74309: variable 'omit' from source: magic vars 49116 1727204707.74644: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.74655: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204707.74801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204707.76584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204707.76644: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204707.76679: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204707.76706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204707.76726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204707.76799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204707.76822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204707.76844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204707.76875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204707.76889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204707.76970: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.76985: Evaluated conditional (ansible_distribution_major_version | int > 9): True 49116 1727204707.77081: variable 'ansible_distribution' from source: facts 49116 1727204707.77087: variable '__network_rh_distros' from source: role '' defaults 49116 1727204707.77094: Evaluated conditional (ansible_distribution in __network_rh_distros): False 49116 1727204707.77098: when evaluation is False, skipping this task 49116 1727204707.77102: _execute() done 49116 1727204707.77104: dumping result to json 49116 1727204707.77108: done dumping result, returning 49116 1727204707.77117: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-02f7-957b-000000000069] 49116 1727204707.77120: sending task result for task 127b8e07-fff9-02f7-957b-000000000069 49116 1727204707.77225: done sending task result for task 127b8e07-fff9-02f7-957b-000000000069 49116 1727204707.77228: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 49116 1727204707.77280: no more pending results, returning what we have 49116 1727204707.77283: results queue empty 49116 1727204707.77285: checking for any_errors_fatal 49116 1727204707.77292: done checking for any_errors_fatal 49116 1727204707.77292: checking for max_fail_percentage 49116 1727204707.77295: done checking for max_fail_percentage 49116 1727204707.77296: checking to see if all hosts have failed and the running result is not ok 49116 1727204707.77296: done checking to see if all hosts have failed 49116 1727204707.77297: getting the remaining hosts for this loop 49116 1727204707.77298: done getting the remaining hosts for this loop 49116 1727204707.77303: getting the next task for host managed-node3 49116 1727204707.77310: done getting next task for host managed-node3 49116 1727204707.77314: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 49116 1727204707.77317: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204707.77339: getting variables 49116 1727204707.77341: in VariableManager get_vars() 49116 1727204707.77387: Calling all_inventory to load vars for managed-node3 49116 1727204707.77390: Calling groups_inventory to load vars for managed-node3 49116 1727204707.77392: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204707.77403: Calling all_plugins_play to load vars for managed-node3 49116 1727204707.77406: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204707.77408: Calling groups_plugins_play to load vars for managed-node3 49116 1727204707.78582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204707.79993: done with get_vars() 49116 1727204707.80023: done getting variables 49116 1727204707.80077: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.066) 0:00:30.826 ***** 49116 1727204707.80105: entering _queue_task() for managed-node3/dnf 49116 1727204707.80395: worker is 1 (out of 1 available) 49116 1727204707.80412: exiting _queue_task() for managed-node3/dnf 49116 1727204707.80425: done queuing things up, now waiting for results queue to drain 49116 1727204707.80427: waiting for pending results... 
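Note: the DNF update check queued above (main.yml:36) only runs when wireless or team connections are requested. In this run network_connections is built from the interface and vlan_interface play vars, so __network_wireless_connections_defined and __network_team_connections_defined both come out False and the task skips. Assuming those flags key off the connection type, as the task names suggest, a hypothetical network_connections that would trigger the check could look like this; the interface names are placeholders, not values from this run.

network_connections:
  - name: team0     # hypothetical team interface
    type: team
    state: up
  - name: wlan0     # hypothetical wireless interface
    type: wireless
    state: up
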
49116 1727204707.80642: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 49116 1727204707.80751: in run() - task 127b8e07-fff9-02f7-957b-00000000006a 49116 1727204707.80765: variable 'ansible_search_path' from source: unknown 49116 1727204707.80776: variable 'ansible_search_path' from source: unknown 49116 1727204707.80806: calling self._execute() 49116 1727204707.80971: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204707.80975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204707.80978: variable 'omit' from source: magic vars 49116 1727204707.81420: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.81428: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204707.81657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204707.84313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204707.84399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204707.84454: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204707.84518: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204707.84542: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204707.84673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204707.84692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204707.84724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204707.84783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204707.84844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204707.84951: variable 'ansible_distribution' from source: facts 49116 1727204707.84965: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.84981: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 49116 1727204707.85120: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204707.85291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204707.85325: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204707.85373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204707.85398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204707.85438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204707.85470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204707.85488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204707.85508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204707.85544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204707.85555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204707.85589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204707.85609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204707.85635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204707.85667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204707.85678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204707.85800: variable 'network_connections' from source: task vars 49116 1727204707.85811: variable 'interface' from source: play vars 49116 1727204707.85869: variable 'interface' from source: play vars 49116 1727204707.85878: variable 'vlan_interface' from source: play vars 49116 1727204707.85925: variable 'vlan_interface' from source: play vars 49116 1727204707.85989: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204707.86128: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204707.86165: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204707.86190: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204707.86215: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204707.86253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204707.86273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204707.86299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204707.86315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204707.86360: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204707.86535: variable 'network_connections' from source: task vars 49116 1727204707.86543: variable 'interface' from source: play vars 49116 1727204707.86591: variable 'interface' from source: play vars 49116 1727204707.86601: variable 'vlan_interface' from source: play vars 49116 1727204707.86647: variable 'vlan_interface' from source: play vars 49116 1727204707.86666: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49116 1727204707.86670: when evaluation is False, skipping this task 49116 1727204707.86673: _execute() done 49116 1727204707.86676: dumping result to json 49116 1727204707.86678: done dumping result, returning 49116 1727204707.86687: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-02f7-957b-00000000006a] 49116 1727204707.86691: sending task result for task 127b8e07-fff9-02f7-957b-00000000006a 49116 1727204707.86789: done sending task result for task 127b8e07-fff9-02f7-957b-00000000006a 49116 1727204707.86792: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49116 1727204707.86865: no more pending results, returning what we have 49116 1727204707.86870: results queue empty 49116 1727204707.86871: checking for any_errors_fatal 49116 1727204707.86877: done checking for any_errors_fatal 49116 1727204707.86877: checking for max_fail_percentage 49116 1727204707.86879: done checking for max_fail_percentage 49116 1727204707.86881: checking to see if all hosts have failed and the running result is not ok 49116 1727204707.86881: done checking to see if all hosts have failed 49116 1727204707.86882: getting the remaining hosts for this loop 
49116 1727204707.86883: done getting the remaining hosts for this loop 49116 1727204707.86888: getting the next task for host managed-node3 49116 1727204707.86895: done getting next task for host managed-node3 49116 1727204707.86899: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 49116 1727204707.86903: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204707.86923: getting variables 49116 1727204707.86925: in VariableManager get_vars() 49116 1727204707.86974: Calling all_inventory to load vars for managed-node3 49116 1727204707.86978: Calling groups_inventory to load vars for managed-node3 49116 1727204707.86980: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204707.86990: Calling all_plugins_play to load vars for managed-node3 49116 1727204707.86993: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204707.86995: Calling groups_plugins_play to load vars for managed-node3 49116 1727204707.88543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204707.90754: done with get_vars() 49116 1727204707.90794: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 49116 1727204707.90885: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:05:07 -0400 (0:00:00.108) 0:00:30.934 ***** 49116 1727204707.90921: entering _queue_task() for managed-node3/yum 49116 1727204707.91408: worker is 1 (out of 1 available) 49116 1727204707.91423: exiting _queue_task() for managed-node3/yum 49116 1727204707.91434: done queuing things up, now waiting for results queue to drain 49116 1727204707.91435: waiting for pending results... 
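Note: the YUM check queued above (main.yml:48) is the older-release counterpart of the DNF task; it is guarded by ansible_distribution_major_version | int < 8, which is False on this host, so it skips. The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line shows how this controller resolves the yum action: a task written against ansible.builtin.yum is serviced by the dnf action plugin. A generic illustration of that resolution, not the role's actual task (whose arguments are not visible in this log; the package name here is only an example):

- name: Example task written against the yum action
  ansible.builtin.yum:        # resolved to the ansible.builtin.dnf action plugin by this controller, per the redirect line above
    name: NetworkManager
    state: present
  when: ansible_distribution_major_version | int < 8
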
49116 1727204707.91786: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 49116 1727204707.91973: in run() - task 127b8e07-fff9-02f7-957b-00000000006b 49116 1727204707.91977: variable 'ansible_search_path' from source: unknown 49116 1727204707.91980: variable 'ansible_search_path' from source: unknown 49116 1727204707.91983: calling self._execute() 49116 1727204707.92054: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204707.92068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204707.92083: variable 'omit' from source: magic vars 49116 1727204707.92512: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.92536: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204707.92745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204707.95832: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204707.95910: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204707.96033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204707.96036: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204707.96040: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204707.96137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204707.96178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204707.96211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204707.96360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204707.96364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204707.96394: variable 'ansible_distribution_major_version' from source: facts 49116 1727204707.96418: Evaluated conditional (ansible_distribution_major_version | int < 8): False 49116 1727204707.96426: when evaluation is False, skipping this task 49116 1727204707.96433: _execute() done 49116 1727204707.96440: dumping result to json 49116 1727204707.96447: done dumping result, returning 49116 1727204707.96460: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-02f7-957b-00000000006b] 49116 
1727204707.96484: sending task result for task 127b8e07-fff9-02f7-957b-00000000006b skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 49116 1727204707.96650: no more pending results, returning what we have 49116 1727204707.96654: results queue empty 49116 1727204707.96655: checking for any_errors_fatal 49116 1727204707.96662: done checking for any_errors_fatal 49116 1727204707.96663: checking for max_fail_percentage 49116 1727204707.96665: done checking for max_fail_percentage 49116 1727204707.96769: checking to see if all hosts have failed and the running result is not ok 49116 1727204707.96771: done checking to see if all hosts have failed 49116 1727204707.96772: getting the remaining hosts for this loop 49116 1727204707.96774: done getting the remaining hosts for this loop 49116 1727204707.96780: getting the next task for host managed-node3 49116 1727204707.96788: done getting next task for host managed-node3 49116 1727204707.96793: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 49116 1727204707.96796: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204707.96818: getting variables 49116 1727204707.96820: in VariableManager get_vars() 49116 1727204707.97069: Calling all_inventory to load vars for managed-node3 49116 1727204707.97073: Calling groups_inventory to load vars for managed-node3 49116 1727204707.97076: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204707.97089: Calling all_plugins_play to load vars for managed-node3 49116 1727204707.97092: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204707.97095: Calling groups_plugins_play to load vars for managed-node3 49116 1727204707.97787: done sending task result for task 127b8e07-fff9-02f7-957b-00000000006b 49116 1727204707.97792: WORKER PROCESS EXITING 49116 1727204708.00382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204708.02613: done with get_vars() 49116 1727204708.02653: done getting variables 49116 1727204708.02723: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.118) 0:00:31.052 ***** 49116 1727204708.02762: entering _queue_task() for managed-node3/fail 49116 1727204708.03270: worker is 1 (out of 1 available) 49116 1727204708.03285: exiting _queue_task() for managed-node3/fail 49116 1727204708.03297: done queuing things up, now waiting for results queue to drain 49116 1727204708.03299: waiting for pending results... 
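Note: the consent prompt queued above (main.yml:60) is implemented with the 'fail' action, so when its condition holds it presumably aborts the play unless some opt-in variable is set; that variable is not visible in this run. Like the two package-update checks, it is gated on wireless or team connections and therefore skips here. A rough sketch using only the condition this run evaluates, with the message assumed:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: NetworkManager needs to be restarted to apply wireless or team configuration  # assumed wording
  when: __network_wireless_connections_defined or __network_team_connections_defined   # False in this run, so the task skips
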
49116 1727204708.03533: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 49116 1727204708.03707: in run() - task 127b8e07-fff9-02f7-957b-00000000006c 49116 1727204708.03729: variable 'ansible_search_path' from source: unknown 49116 1727204708.03737: variable 'ansible_search_path' from source: unknown 49116 1727204708.03789: calling self._execute() 49116 1727204708.03907: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204708.03920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204708.03933: variable 'omit' from source: magic vars 49116 1727204708.04352: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.04373: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204708.04518: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204708.04751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204708.07297: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204708.07392: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204708.07436: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204708.07482: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204708.07519: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204708.07622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.07771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.07775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.07778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.07781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.07812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.07842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.07876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.07928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.07948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.08005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.08035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.08065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.08118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.08138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.08439: variable 'network_connections' from source: task vars 49116 1727204708.08443: variable 'interface' from source: play vars 49116 1727204708.08446: variable 'interface' from source: play vars 49116 1727204708.08462: variable 'vlan_interface' from source: play vars 49116 1727204708.08535: variable 'vlan_interface' from source: play vars 49116 1727204708.08615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204708.08810: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204708.08857: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204708.08896: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204708.08925: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204708.08976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204708.09010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204708.09309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.09313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 
1727204708.09316: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204708.09874: variable 'network_connections' from source: task vars 49116 1727204708.09891: variable 'interface' from source: play vars 49116 1727204708.10101: variable 'interface' from source: play vars 49116 1727204708.10118: variable 'vlan_interface' from source: play vars 49116 1727204708.10271: variable 'vlan_interface' from source: play vars 49116 1727204708.10338: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49116 1727204708.10347: when evaluation is False, skipping this task 49116 1727204708.10355: _execute() done 49116 1727204708.10471: dumping result to json 49116 1727204708.10474: done dumping result, returning 49116 1727204708.10477: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-02f7-957b-00000000006c] 49116 1727204708.10488: sending task result for task 127b8e07-fff9-02f7-957b-00000000006c skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49116 1727204708.10726: no more pending results, returning what we have 49116 1727204708.10730: results queue empty 49116 1727204708.10732: checking for any_errors_fatal 49116 1727204708.10737: done checking for any_errors_fatal 49116 1727204708.10738: checking for max_fail_percentage 49116 1727204708.10741: done checking for max_fail_percentage 49116 1727204708.10742: checking to see if all hosts have failed and the running result is not ok 49116 1727204708.10743: done checking to see if all hosts have failed 49116 1727204708.10744: getting the remaining hosts for this loop 49116 1727204708.10745: done getting the remaining hosts for this loop 49116 1727204708.10750: getting the next task for host managed-node3 49116 1727204708.10759: done getting next task for host managed-node3 49116 1727204708.10763: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 49116 1727204708.10768: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204708.10791: getting variables 49116 1727204708.10793: in VariableManager get_vars() 49116 1727204708.10841: Calling all_inventory to load vars for managed-node3 49116 1727204708.10844: Calling groups_inventory to load vars for managed-node3 49116 1727204708.10847: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204708.10861: Calling all_plugins_play to load vars for managed-node3 49116 1727204708.10864: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204708.11287: Calling groups_plugins_play to load vars for managed-node3 49116 1727204708.12312: done sending task result for task 127b8e07-fff9-02f7-957b-00000000006c 49116 1727204708.12316: WORKER PROCESS EXITING 49116 1727204708.14732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204708.16940: done with get_vars() 49116 1727204708.16982: done getting variables 49116 1727204708.17050: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.143) 0:00:31.195 ***** 49116 1727204708.17091: entering _queue_task() for managed-node3/package 49116 1727204708.17494: worker is 1 (out of 1 available) 49116 1727204708.17510: exiting _queue_task() for managed-node3/package 49116 1727204708.17524: done queuing things up, now waiting for results queue to drain 49116 1727204708.17525: waiting for pending results... 
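Note: the "Install packages" task queued above (main.yml:73) goes through the generic 'package' action. Its guard appears verbatim further down in this log: it only runs when the computed network_packages list is not already a subset of the gathered package facts, i.e. when something is missing. A minimal sketch; the name/state arguments are assumptions, while the when expression is the one this run evaluates.

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # assumed argument; the list is assembled from the role's __network_packages_default_* defaults seen below
    state: present                   # assumed argument
  when: not network_packages is subset(ansible_facts.packages.keys())   # evaluated False here, so nothing needs installing

Because every package in network_packages is already present on managed-node3, the task is skipped rather than reported as ok or changed.
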
49116 1727204708.17851: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 49116 1727204708.18021: in run() - task 127b8e07-fff9-02f7-957b-00000000006d 49116 1727204708.18043: variable 'ansible_search_path' from source: unknown 49116 1727204708.18051: variable 'ansible_search_path' from source: unknown 49116 1727204708.18102: calling self._execute() 49116 1727204708.18215: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204708.18227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204708.18242: variable 'omit' from source: magic vars 49116 1727204708.18661: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.18681: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204708.18911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204708.19214: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204708.19276: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204708.19316: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204708.19409: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204708.19549: variable 'network_packages' from source: role '' defaults 49116 1727204708.19688: variable '__network_provider_setup' from source: role '' defaults 49116 1727204708.19707: variable '__network_service_name_default_nm' from source: role '' defaults 49116 1727204708.19787: variable '__network_service_name_default_nm' from source: role '' defaults 49116 1727204708.19801: variable '__network_packages_default_nm' from source: role '' defaults 49116 1727204708.19870: variable '__network_packages_default_nm' from source: role '' defaults 49116 1727204708.20089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204708.23139: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204708.23373: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204708.23460: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204708.23508: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204708.23584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204708.23973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.23979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.23983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.24075: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.24128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.24316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.24337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.24372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.24470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.24522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.25178: variable '__network_packages_default_gobject_packages' from source: role '' defaults 49116 1727204708.25569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.25605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.25657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.25707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.25727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.25840: variable 'ansible_python' from source: facts 49116 1727204708.25882: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 49116 1727204708.25986: variable '__network_wpa_supplicant_required' from source: role '' defaults 49116 1727204708.26084: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49116 1727204708.26239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.26291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 49116 1727204708.26310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.26357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.26400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.26443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.26508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.26521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.26571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.26591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.26834: variable 'network_connections' from source: task vars 49116 1727204708.26838: variable 'interface' from source: play vars 49116 1727204708.26889: variable 'interface' from source: play vars 49116 1727204708.26905: variable 'vlan_interface' from source: play vars 49116 1727204708.27020: variable 'vlan_interface' from source: play vars 49116 1727204708.27112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204708.27144: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204708.27187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.27225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204708.27298: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204708.27633: variable 'network_connections' from source: task vars 49116 1727204708.27644: variable 'interface' from source: play vars 49116 1727204708.27761: variable 'interface' from source: play vars 49116 1727204708.27782: variable 'vlan_interface' from source: play vars 49116 1727204708.27926: variable 'vlan_interface' from source: play vars 49116 1727204708.27946: 
variable '__network_packages_default_wireless' from source: role '' defaults 49116 1727204708.28039: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204708.28408: variable 'network_connections' from source: task vars 49116 1727204708.28471: variable 'interface' from source: play vars 49116 1727204708.28497: variable 'interface' from source: play vars 49116 1727204708.28511: variable 'vlan_interface' from source: play vars 49116 1727204708.28588: variable 'vlan_interface' from source: play vars 49116 1727204708.28618: variable '__network_packages_default_team' from source: role '' defaults 49116 1727204708.28718: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204708.29077: variable 'network_connections' from source: task vars 49116 1727204708.29087: variable 'interface' from source: play vars 49116 1727204708.29160: variable 'interface' from source: play vars 49116 1727204708.29235: variable 'vlan_interface' from source: play vars 49116 1727204708.29248: variable 'vlan_interface' from source: play vars 49116 1727204708.29318: variable '__network_service_name_default_initscripts' from source: role '' defaults 49116 1727204708.29392: variable '__network_service_name_default_initscripts' from source: role '' defaults 49116 1727204708.29404: variable '__network_packages_default_initscripts' from source: role '' defaults 49116 1727204708.29475: variable '__network_packages_default_initscripts' from source: role '' defaults 49116 1727204708.29729: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 49116 1727204708.30299: variable 'network_connections' from source: task vars 49116 1727204708.30310: variable 'interface' from source: play vars 49116 1727204708.30383: variable 'interface' from source: play vars 49116 1727204708.30396: variable 'vlan_interface' from source: play vars 49116 1727204708.30476: variable 'vlan_interface' from source: play vars 49116 1727204708.30542: variable 'ansible_distribution' from source: facts 49116 1727204708.30545: variable '__network_rh_distros' from source: role '' defaults 49116 1727204708.30547: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.30550: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 49116 1727204708.30710: variable 'ansible_distribution' from source: facts 49116 1727204708.30719: variable '__network_rh_distros' from source: role '' defaults 49116 1727204708.30728: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.30739: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 49116 1727204708.30924: variable 'ansible_distribution' from source: facts 49116 1727204708.30933: variable '__network_rh_distros' from source: role '' defaults 49116 1727204708.30943: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.30986: variable 'network_provider' from source: set_fact 49116 1727204708.31007: variable 'ansible_facts' from source: unknown 49116 1727204708.31962: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 49116 1727204708.32173: when evaluation is False, skipping this task 49116 1727204708.32177: _execute() done 49116 1727204708.32180: dumping result to json 49116 1727204708.32182: done dumping result, returning 49116 1727204708.32185: done running TaskExecutor() for managed-node3/TASK: 
fedora.linux_system_roles.network : Install packages [127b8e07-fff9-02f7-957b-00000000006d] 49116 1727204708.32187: sending task result for task 127b8e07-fff9-02f7-957b-00000000006d 49116 1727204708.32272: done sending task result for task 127b8e07-fff9-02f7-957b-00000000006d 49116 1727204708.32276: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 49116 1727204708.32331: no more pending results, returning what we have 49116 1727204708.32335: results queue empty 49116 1727204708.32336: checking for any_errors_fatal 49116 1727204708.32346: done checking for any_errors_fatal 49116 1727204708.32347: checking for max_fail_percentage 49116 1727204708.32349: done checking for max_fail_percentage 49116 1727204708.32350: checking to see if all hosts have failed and the running result is not ok 49116 1727204708.32351: done checking to see if all hosts have failed 49116 1727204708.32352: getting the remaining hosts for this loop 49116 1727204708.32353: done getting the remaining hosts for this loop 49116 1727204708.32358: getting the next task for host managed-node3 49116 1727204708.32367: done getting next task for host managed-node3 49116 1727204708.32372: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 49116 1727204708.32375: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204708.32403: getting variables 49116 1727204708.32405: in VariableManager get_vars() 49116 1727204708.32451: Calling all_inventory to load vars for managed-node3 49116 1727204708.32454: Calling groups_inventory to load vars for managed-node3 49116 1727204708.32457: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204708.32672: Calling all_plugins_play to load vars for managed-node3 49116 1727204708.32677: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204708.32682: Calling groups_plugins_play to load vars for managed-node3 49116 1727204708.34438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204708.38377: done with get_vars() 49116 1727204708.38422: done getting variables 49116 1727204708.38720: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.216) 0:00:31.412 ***** 49116 1727204708.38759: entering _queue_task() for managed-node3/package 49116 1727204708.39289: worker is 1 (out of 1 available) 49116 1727204708.39304: exiting _queue_task() for managed-node3/package 49116 1727204708.39317: done queuing things up, now waiting for results queue to drain 49116 1727204708.39319: waiting for pending results... 
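The preceding records show the "Install packages" task being skipped: its guard, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False, meaning every package listed in network_packages is already present in the gathered package facts, so there is nothing to install. A minimal sketch of a task shaped like this one, assuming the package module and its name/state arguments (only the task name and the when-condition are confirmed by the records above):

    # Sketch only: the task name and when-condition come from the log above;
    # the module and its arguments are illustrative assumptions.
    - name: Install packages
      package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())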
49116 1727204708.39670: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 49116 1727204708.39744: in run() - task 127b8e07-fff9-02f7-957b-00000000006e 49116 1727204708.39769: variable 'ansible_search_path' from source: unknown 49116 1727204708.39778: variable 'ansible_search_path' from source: unknown 49116 1727204708.39828: calling self._execute() 49116 1727204708.39954: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204708.39968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204708.39984: variable 'omit' from source: magic vars 49116 1727204708.40416: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.40441: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204708.40655: variable 'network_state' from source: role '' defaults 49116 1727204708.40658: Evaluated conditional (network_state != {}): False 49116 1727204708.40660: when evaluation is False, skipping this task 49116 1727204708.40663: _execute() done 49116 1727204708.40666: dumping result to json 49116 1727204708.40669: done dumping result, returning 49116 1727204708.40672: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-02f7-957b-00000000006e] 49116 1727204708.40675: sending task result for task 127b8e07-fff9-02f7-957b-00000000006e skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204708.41015: no more pending results, returning what we have 49116 1727204708.41018: results queue empty 49116 1727204708.41020: checking for any_errors_fatal 49116 1727204708.41025: done checking for any_errors_fatal 49116 1727204708.41026: checking for max_fail_percentage 49116 1727204708.41028: done checking for max_fail_percentage 49116 1727204708.41028: checking to see if all hosts have failed and the running result is not ok 49116 1727204708.41029: done checking to see if all hosts have failed 49116 1727204708.41030: getting the remaining hosts for this loop 49116 1727204708.41032: done getting the remaining hosts for this loop 49116 1727204708.41036: getting the next task for host managed-node3 49116 1727204708.41043: done getting next task for host managed-node3 49116 1727204708.41048: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 49116 1727204708.41051: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204708.41074: getting variables 49116 1727204708.41076: in VariableManager get_vars() 49116 1727204708.41123: Calling all_inventory to load vars for managed-node3 49116 1727204708.41126: Calling groups_inventory to load vars for managed-node3 49116 1727204708.41129: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204708.41143: Calling all_plugins_play to load vars for managed-node3 49116 1727204708.41146: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204708.41150: Calling groups_plugins_play to load vars for managed-node3 49116 1727204708.41697: done sending task result for task 127b8e07-fff9-02f7-957b-00000000006e 49116 1727204708.41701: WORKER PROCESS EXITING 49116 1727204708.43175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204708.45354: done with get_vars() 49116 1727204708.45393: done getting variables 49116 1727204708.45460: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.067) 0:00:31.480 ***** 49116 1727204708.45503: entering _queue_task() for managed-node3/package 49116 1727204708.45995: worker is 1 (out of 1 available) 49116 1727204708.46007: exiting _queue_task() for managed-node3/package 49116 1727204708.46023: done queuing things up, now waiting for results queue to drain 49116 1727204708.46024: waiting for pending results... 
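The task at main.yml:85 above was skipped because network_state comes from the role defaults as an empty dict, so its guard network_state != {} is False; the python3-libnmstate task at main.yml:96 that runs next is skipped on the same condition. A purely illustrative play snippet, assuming nmstate-style keys, showing how a caller could define network_state so those guards become True (the interface name and settings below are invented for illustration and are not taken from this run):

    # Hypothetical usage sketch, not part of this run:
    - hosts: managed-node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces:
                - name: eth0        # illustrative interface, not from this log
                  type: ethernet
                  state: up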
49116 1727204708.46251: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 49116 1727204708.46414: in run() - task 127b8e07-fff9-02f7-957b-00000000006f 49116 1727204708.46438: variable 'ansible_search_path' from source: unknown 49116 1727204708.46446: variable 'ansible_search_path' from source: unknown 49116 1727204708.46495: calling self._execute() 49116 1727204708.46610: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204708.46622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204708.46636: variable 'omit' from source: magic vars 49116 1727204708.47054: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.47074: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204708.47216: variable 'network_state' from source: role '' defaults 49116 1727204708.47236: Evaluated conditional (network_state != {}): False 49116 1727204708.47244: when evaluation is False, skipping this task 49116 1727204708.47252: _execute() done 49116 1727204708.47258: dumping result to json 49116 1727204708.47268: done dumping result, returning 49116 1727204708.47279: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-02f7-957b-00000000006f] 49116 1727204708.47290: sending task result for task 127b8e07-fff9-02f7-957b-00000000006f skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204708.47473: no more pending results, returning what we have 49116 1727204708.47478: results queue empty 49116 1727204708.47479: checking for any_errors_fatal 49116 1727204708.47488: done checking for any_errors_fatal 49116 1727204708.47489: checking for max_fail_percentage 49116 1727204708.47492: done checking for max_fail_percentage 49116 1727204708.47493: checking to see if all hosts have failed and the running result is not ok 49116 1727204708.47494: done checking to see if all hosts have failed 49116 1727204708.47495: getting the remaining hosts for this loop 49116 1727204708.47496: done getting the remaining hosts for this loop 49116 1727204708.47500: getting the next task for host managed-node3 49116 1727204708.47509: done getting next task for host managed-node3 49116 1727204708.47514: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 49116 1727204708.47518: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204708.47546: getting variables 49116 1727204708.47548: in VariableManager get_vars() 49116 1727204708.47699: Calling all_inventory to load vars for managed-node3 49116 1727204708.47703: Calling groups_inventory to load vars for managed-node3 49116 1727204708.47705: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204708.47721: Calling all_plugins_play to load vars for managed-node3 49116 1727204708.47724: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204708.47728: Calling groups_plugins_play to load vars for managed-node3 49116 1727204708.48586: done sending task result for task 127b8e07-fff9-02f7-957b-00000000006f 49116 1727204708.48590: WORKER PROCESS EXITING 49116 1727204708.50469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204708.54919: done with get_vars() 49116 1727204708.54958: done getting variables 49116 1727204708.55329: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.098) 0:00:31.578 ***** 49116 1727204708.55372: entering _queue_task() for managed-node3/service 49116 1727204708.56262: worker is 1 (out of 1 available) 49116 1727204708.56281: exiting _queue_task() for managed-node3/service 49116 1727204708.56297: done queuing things up, now waiting for results queue to drain 49116 1727204708.56298: waiting for pending results... 
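The next task, "Restart NetworkManager due to wireless or team interfaces" (main.yml:109), uses the service action plugin loaded just above and is guarded by __network_wireless_connections_defined or __network_team_connections_defined; in the records that follow, the connections built from the interface and vlan_interface play vars are neither wireless nor team, so both flags are false and the task is skipped. A minimal sketch of a task of this shape, assuming the service name and restart arguments (only the task name, the service action plugin, and the when-condition are confirmed by the log):

    # Sketch only: the task name and when-condition come from the log;
    # the service name and state are illustrative assumptions.
    - name: Restart NetworkManager due to wireless or team interfaces
      service:
        name: NetworkManager     # assumed, consistent with the task name
        state: restarted         # assumed restart semantics
      when: __network_wireless_connections_defined or __network_team_connections_defined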
49116 1727204708.56813: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 49116 1727204708.57272: in run() - task 127b8e07-fff9-02f7-957b-000000000070 49116 1727204708.57277: variable 'ansible_search_path' from source: unknown 49116 1727204708.57281: variable 'ansible_search_path' from source: unknown 49116 1727204708.57285: calling self._execute() 49116 1727204708.57565: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204708.57872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204708.57877: variable 'omit' from source: magic vars 49116 1727204708.58574: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.58671: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204708.59037: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204708.59428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204708.65312: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204708.65974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204708.65980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204708.65984: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204708.65988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204708.66373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.66412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.66499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.66716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.66737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.66801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.66836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.66867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 49116 1727204708.67113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.67371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.67376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.67379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.67381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.67383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.67385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.67693: variable 'network_connections' from source: task vars 49116 1727204708.67887: variable 'interface' from source: play vars 49116 1727204708.67979: variable 'interface' from source: play vars 49116 1727204708.67998: variable 'vlan_interface' from source: play vars 49116 1727204708.68237: variable 'vlan_interface' from source: play vars 49116 1727204708.68329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204708.69096: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204708.69324: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204708.69377: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204708.69417: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204708.69486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204708.69702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204708.69742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.69779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204708.69849: variable '__network_team_connections_defined' from source: role '' 
defaults 49116 1727204708.70371: variable 'network_connections' from source: task vars 49116 1727204708.70584: variable 'interface' from source: play vars 49116 1727204708.70672: variable 'interface' from source: play vars 49116 1727204708.70971: variable 'vlan_interface' from source: play vars 49116 1727204708.70975: variable 'vlan_interface' from source: play vars 49116 1727204708.70999: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49116 1727204708.71009: when evaluation is False, skipping this task 49116 1727204708.71018: _execute() done 49116 1727204708.71027: dumping result to json 49116 1727204708.71040: done dumping result, returning 49116 1727204708.71055: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-02f7-957b-000000000070] 49116 1727204708.71077: sending task result for task 127b8e07-fff9-02f7-957b-000000000070 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49116 1727204708.71242: no more pending results, returning what we have 49116 1727204708.71246: results queue empty 49116 1727204708.71247: checking for any_errors_fatal 49116 1727204708.71257: done checking for any_errors_fatal 49116 1727204708.71258: checking for max_fail_percentage 49116 1727204708.71260: done checking for max_fail_percentage 49116 1727204708.71261: checking to see if all hosts have failed and the running result is not ok 49116 1727204708.71262: done checking to see if all hosts have failed 49116 1727204708.71263: getting the remaining hosts for this loop 49116 1727204708.71264: done getting the remaining hosts for this loop 49116 1727204708.71272: getting the next task for host managed-node3 49116 1727204708.71321: done getting next task for host managed-node3 49116 1727204708.71327: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 49116 1727204708.71330: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204708.71351: getting variables 49116 1727204708.71353: in VariableManager get_vars() 49116 1727204708.71488: Calling all_inventory to load vars for managed-node3 49116 1727204708.71491: Calling groups_inventory to load vars for managed-node3 49116 1727204708.71493: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204708.71511: Calling all_plugins_play to load vars for managed-node3 49116 1727204708.71514: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204708.71571: done sending task result for task 127b8e07-fff9-02f7-957b-000000000070 49116 1727204708.71575: WORKER PROCESS EXITING 49116 1727204708.71579: Calling groups_plugins_play to load vars for managed-node3 49116 1727204708.76422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204708.80862: done with get_vars() 49116 1727204708.80906: done getting variables 49116 1727204708.81188: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:05:08 -0400 (0:00:00.258) 0:00:31.837 ***** 49116 1727204708.81235: entering _queue_task() for managed-node3/service 49116 1727204708.82061: worker is 1 (out of 1 available) 49116 1727204708.82079: exiting _queue_task() for managed-node3/service 49116 1727204708.82096: done queuing things up, now waiting for results queue to drain 49116 1727204708.82097: waiting for pending results... 
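Unlike the preceding tasks, "Enable and start NetworkManager" (main.yml:122) passes its guard in the records that follow: network_state is still an empty dict, but network_provider comes from set_fact and is evidently "nm", so network_provider == "nm" or network_state != {} evaluates True and the service action plugin actually runs. The subsequent records show the usual module-execution sequence over the persistent SSH connection: 'echo ~' to resolve the remote home directory, creation of a per-task temp directory under /root/.ansible/tmp, SFTP transfer of the packaged AnsiballZ_systemd.py, and a chmod before the module is executed. A minimal sketch of a task of this shape, assuming the started/enabled arguments (the task name, the service action plugin, the when-condition, and the network_service_name variable all appear in the log):

    # Sketch only: name, condition, and the network_service_name variable come
    # from the log; the state/enabled arguments are illustrative assumptions.
    - name: Enable and start NetworkManager
      service:
        name: "{{ network_service_name }}"
        state: started           # assumed
        enabled: true            # assumed
      when: network_provider == "nm" or network_state != {}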
49116 1727204708.82887: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 49116 1727204708.82972: in run() - task 127b8e07-fff9-02f7-957b-000000000071 49116 1727204708.83173: variable 'ansible_search_path' from source: unknown 49116 1727204708.83178: variable 'ansible_search_path' from source: unknown 49116 1727204708.83181: calling self._execute() 49116 1727204708.83672: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204708.83677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204708.83681: variable 'omit' from source: magic vars 49116 1727204708.84260: variable 'ansible_distribution_major_version' from source: facts 49116 1727204708.84491: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204708.84897: variable 'network_provider' from source: set_fact 49116 1727204708.84911: variable 'network_state' from source: role '' defaults 49116 1727204708.84931: Evaluated conditional (network_provider == "nm" or network_state != {}): True 49116 1727204708.84947: variable 'omit' from source: magic vars 49116 1727204708.85022: variable 'omit' from source: magic vars 49116 1727204708.85313: variable 'network_service_name' from source: role '' defaults 49116 1727204708.85473: variable 'network_service_name' from source: role '' defaults 49116 1727204708.85543: variable '__network_provider_setup' from source: role '' defaults 49116 1727204708.85973: variable '__network_service_name_default_nm' from source: role '' defaults 49116 1727204708.85977: variable '__network_service_name_default_nm' from source: role '' defaults 49116 1727204708.85980: variable '__network_packages_default_nm' from source: role '' defaults 49116 1727204708.85982: variable '__network_packages_default_nm' from source: role '' defaults 49116 1727204708.86505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204708.90121: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204708.90214: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204708.90263: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204708.90316: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204708.90350: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204708.90446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.90488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.90524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.90581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 49116 1727204708.90600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.90662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.90700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.90741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.90793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.90813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.91096: variable '__network_packages_default_gobject_packages' from source: role '' defaults 49116 1727204708.91252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.91289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.91322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.91383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.91441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.91517: variable 'ansible_python' from source: facts 49116 1727204708.91556: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 49116 1727204708.91670: variable '__network_wpa_supplicant_required' from source: role '' defaults 49116 1727204708.91778: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49116 1727204708.91942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.91983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.92027: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.92071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.92139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.92173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204708.92215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204708.92252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.92309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204708.92353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204708.92512: variable 'network_connections' from source: task vars 49116 1727204708.92530: variable 'interface' from source: play vars 49116 1727204708.92627: variable 'interface' from source: play vars 49116 1727204708.92748: variable 'vlan_interface' from source: play vars 49116 1727204708.92752: variable 'vlan_interface' from source: play vars 49116 1727204708.92884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204708.93125: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204708.93192: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204708.93240: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204708.93292: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204708.93477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204708.93672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204708.93675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204708.93677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204708.93745: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204708.94672: variable 'network_connections' from source: task vars 49116 1727204708.94676: variable 'interface' from source: play vars 49116 1727204708.94782: variable 'interface' from source: play vars 49116 1727204708.94804: variable 'vlan_interface' from source: play vars 49116 1727204708.94893: variable 'vlan_interface' from source: play vars 49116 1727204708.95055: variable '__network_packages_default_wireless' from source: role '' defaults 49116 1727204708.95202: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204708.95914: variable 'network_connections' from source: task vars 49116 1727204708.95980: variable 'interface' from source: play vars 49116 1727204708.96071: variable 'interface' from source: play vars 49116 1727204708.96271: variable 'vlan_interface' from source: play vars 49116 1727204708.96301: variable 'vlan_interface' from source: play vars 49116 1727204708.96399: variable '__network_packages_default_team' from source: role '' defaults 49116 1727204708.96600: variable '__network_team_connections_defined' from source: role '' defaults 49116 1727204708.97298: variable 'network_connections' from source: task vars 49116 1727204708.97315: variable 'interface' from source: play vars 49116 1727204708.97456: variable 'interface' from source: play vars 49116 1727204708.97540: variable 'vlan_interface' from source: play vars 49116 1727204708.97697: variable 'vlan_interface' from source: play vars 49116 1727204708.97898: variable '__network_service_name_default_initscripts' from source: role '' defaults 49116 1727204708.98172: variable '__network_service_name_default_initscripts' from source: role '' defaults 49116 1727204708.98176: variable '__network_packages_default_initscripts' from source: role '' defaults 49116 1727204708.98179: variable '__network_packages_default_initscripts' from source: role '' defaults 49116 1727204708.98762: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 49116 1727204709.00052: variable 'network_connections' from source: task vars 49116 1727204709.00064: variable 'interface' from source: play vars 49116 1727204709.00343: variable 'interface' from source: play vars 49116 1727204709.00346: variable 'vlan_interface' from source: play vars 49116 1727204709.00349: variable 'vlan_interface' from source: play vars 49116 1727204709.00560: variable 'ansible_distribution' from source: facts 49116 1727204709.00564: variable '__network_rh_distros' from source: role '' defaults 49116 1727204709.00569: variable 'ansible_distribution_major_version' from source: facts 49116 1727204709.00571: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 49116 1727204709.00898: variable 'ansible_distribution' from source: facts 49116 1727204709.00908: variable '__network_rh_distros' from source: role '' defaults 49116 1727204709.00919: variable 'ansible_distribution_major_version' from source: facts 49116 1727204709.00931: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 49116 1727204709.01636: variable 'ansible_distribution' from source: facts 49116 1727204709.01744: variable '__network_rh_distros' from source: role '' defaults 49116 1727204709.01747: variable 
'ansible_distribution_major_version' from source: facts 49116 1727204709.01750: variable 'network_provider' from source: set_fact 49116 1727204709.01752: variable 'omit' from source: magic vars 49116 1727204709.01778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204709.01814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204709.01843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204709.01873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204709.01976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204709.02012: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204709.02023: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204709.02031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204709.02259: Set connection var ansible_connection to ssh 49116 1727204709.02308: Set connection var ansible_timeout to 10 49116 1727204709.02322: Set connection var ansible_shell_executable to /bin/sh 49116 1727204709.02508: Set connection var ansible_pipelining to False 49116 1727204709.02512: Set connection var ansible_shell_type to sh 49116 1727204709.02514: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204709.02516: variable 'ansible_shell_executable' from source: unknown 49116 1727204709.02519: variable 'ansible_connection' from source: unknown 49116 1727204709.02521: variable 'ansible_module_compression' from source: unknown 49116 1727204709.02523: variable 'ansible_shell_type' from source: unknown 49116 1727204709.02525: variable 'ansible_shell_executable' from source: unknown 49116 1727204709.02535: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204709.02538: variable 'ansible_pipelining' from source: unknown 49116 1727204709.02540: variable 'ansible_timeout' from source: unknown 49116 1727204709.02542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204709.02779: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204709.02854: variable 'omit' from source: magic vars 49116 1727204709.02868: starting attempt loop 49116 1727204709.02971: running the handler 49116 1727204709.03167: variable 'ansible_facts' from source: unknown 49116 1727204709.05007: _low_level_execute_command(): starting 49116 1727204709.05012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204709.06274: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204709.06299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.06304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204709.06341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.06452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204709.06590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204709.06693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204709.08572: stdout chunk (state=3): >>>/root <<< 49116 1727204709.08747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204709.08763: stdout chunk (state=3): >>><<< 49116 1727204709.08779: stderr chunk (state=3): >>><<< 49116 1727204709.08825: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204709.08845: _low_level_execute_command(): starting 49116 1727204709.08858: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703 `" && echo ansible-tmp-1727204709.0883179-50893-113799389166703="` echo /root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703 `" ) && sleep 0' 49116 1727204709.09547: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204709.09554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204709.09568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204709.09592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204709.09603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 <<< 49116 1727204709.09610: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204709.09619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.09642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204709.09649: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204709.09656: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49116 1727204709.09663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204709.09675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204709.09687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204709.09694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204709.09700: stderr chunk (state=3): >>>debug2: match found <<< 49116 1727204709.09709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.09787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204709.09797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204709.09958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204709.12127: stdout chunk (state=3): >>>ansible-tmp-1727204709.0883179-50893-113799389166703=/root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703 <<< 49116 1727204709.12352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204709.12356: stdout chunk (state=3): >>><<< 49116 1727204709.12359: stderr chunk (state=3): >>><<< 49116 1727204709.12382: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204709.0883179-50893-113799389166703=/root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204709.12459: variable 'ansible_module_compression' from source: unknown 49116 1727204709.12503: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 49116 1727204709.12772: variable 'ansible_facts' from source: unknown 49116 1727204709.12790: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/AnsiballZ_systemd.py 49116 1727204709.13020: Sending initial data 49116 1727204709.13030: Sent initial data (156 bytes) 49116 1727204709.13709: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204709.13725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204709.13739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204709.13872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204709.13880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204709.13912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204709.14012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204709.15833: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204709.15919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204709.16105: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpxhq1qu5m /root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/AnsiballZ_systemd.py <<< 49116 1727204709.16152: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/AnsiballZ_systemd.py" <<< 49116 1727204709.16157: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpxhq1qu5m" to remote "/root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/AnsiballZ_systemd.py" <<< 49116 1727204709.18397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204709.18401: stdout chunk (state=3): >>><<< 49116 1727204709.18404: stderr chunk (state=3): >>><<< 49116 1727204709.18406: done transferring module to remote 49116 1727204709.18409: _low_level_execute_command(): starting 49116 1727204709.18411: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/ /root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/AnsiballZ_systemd.py && sleep 0' 49116 1727204709.20070: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204709.20076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204709.20101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 49116 1727204709.20105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 49116 1727204709.20107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204709.20185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.20228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204709.20247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204709.20275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204709.20407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204709.22462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204709.22576: stderr chunk (state=3): >>><<< 49116 1727204709.22580: stdout chunk (state=3): >>><<< 49116 1727204709.22600: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204709.22603: _low_level_execute_command(): starting 49116 1727204709.22609: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/AnsiballZ_systemd.py && sleep 0' 49116 1727204709.23565: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204709.23572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204709.23574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204709.23581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204709.23584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204709.23586: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204709.23588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.23592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.23594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204709.23596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204709.23598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204709.23683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204709.57403: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", 
"RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "75711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ExecMainStartTimestampMonotonic": "992436115", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "75711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 15:04:14 EDT] ; stop_time=[n/a] ; pid=75711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 15:04:14 EDT] ; stop_time=[n/a] ; pid=75711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "7558", "MemoryCurrent": "3997696", "MemoryPeak": "4870144", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3513675776", "CPUUsageNSec": "182825000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", 
"LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "Limi<<< 49116 1727204709.57419: stdout chunk (state=3): >>>tSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", 
"FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target cloud-init.service network.target multi-user.target NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus.socket cloud-init-local.service sysinit.target basic.target system.slice network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", <<< 49116 1727204709.57437: stdout chunk (state=3): >>>"Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:04:15 EDT", "StateChangeTimestampMonotonic": "992533498", "InactiveExitTimestamp": "Tue 2024-09-24 15:04:14 EDT", "InactiveExitTimestampMonotonic": "992436398", "ActiveEnterTimestamp": "Tue 2024-09-24 15:04:15 EDT", "ActiveEnterTimestampMonotonic": "992533498", "ActiveExitTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ActiveExitTimestampMonotonic": "992357833", "InactiveEnterTimestamp": "Tue 2024-09-24 15:04:14 EDT", "InactiveEnterTimestampMonotonic": "992431355", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ConditionTimestampMonotonic": "992432565", "AssertTimestamp": "Tue 2024-09-24 15:04:14 EDT", "AssertTimestampMonotonic": "992432569", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "149c166e8026437d99b665831d791274", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 49116 1727204709.59607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204709.59676: stderr chunk (state=3): >>><<< 49116 1727204709.59680: stdout chunk (state=3): >>><<< 49116 1727204709.59695: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "75711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ExecMainStartTimestampMonotonic": "992436115", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "75711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Tue 2024-09-24 15:04:14 EDT] ; stop_time=[n/a] ; pid=75711 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Tue 2024-09-24 15:04:14 EDT] ; stop_time=[n/a] ; pid=75711 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "7558", "MemoryCurrent": "3997696", "MemoryPeak": "4870144", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3513675776", "CPUUsageNSec": "182825000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": 
"infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target cloud-init.service network.target multi-user.target NetworkManager-wait-online.service", "After": "systemd-journald.socket dbus.socket cloud-init-local.service sysinit.target basic.target system.slice network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:04:15 EDT", "StateChangeTimestampMonotonic": "992533498", "InactiveExitTimestamp": "Tue 2024-09-24 15:04:14 EDT", "InactiveExitTimestampMonotonic": "992436398", "ActiveEnterTimestamp": "Tue 2024-09-24 15:04:15 EDT", "ActiveEnterTimestampMonotonic": "992533498", "ActiveExitTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ActiveExitTimestampMonotonic": "992357833", "InactiveEnterTimestamp": "Tue 2024-09-24 15:04:14 EDT", "InactiveEnterTimestampMonotonic": "992431355", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 15:04:14 EDT", "ConditionTimestampMonotonic": "992432565", "AssertTimestamp": "Tue 2024-09-24 15:04:14 EDT", "AssertTimestampMonotonic": "992432569", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "149c166e8026437d99b665831d791274", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 49116 1727204709.59835: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204709.59851: _low_level_execute_command(): starting 49116 1727204709.59856: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204709.0883179-50893-113799389166703/ > /dev/null 2>&1 && sleep 0' 49116 1727204709.60373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204709.60378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.60381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204709.60383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204709.60386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204709.60439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204709.60442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 49116 1727204709.60518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204709.62567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204709.62674: stderr chunk (state=3): >>><<< 49116 1727204709.62678: stdout chunk (state=3): >>><<< 49116 1727204709.62681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204709.62684: handler run complete 49116 1727204709.62707: attempt loop complete, returning result 49116 1727204709.62710: _execute() done 49116 1727204709.62713: dumping result to json 49116 1727204709.62725: done dumping result, returning 49116 1727204709.62741: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-02f7-957b-000000000071] 49116 1727204709.62745: sending task result for task 127b8e07-fff9-02f7-957b-000000000071 49116 1727204709.63026: done sending task result for task 127b8e07-fff9-02f7-957b-000000000071 49116 1727204709.63029: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 1727204709.63097: no more pending results, returning what we have 49116 1727204709.63100: results queue empty 49116 1727204709.63101: checking for any_errors_fatal 49116 1727204709.63108: done checking for any_errors_fatal 49116 1727204709.63109: checking for max_fail_percentage 49116 1727204709.63111: done checking for max_fail_percentage 49116 1727204709.63111: checking to see if all hosts have failed and the running result is not ok 49116 1727204709.63112: done checking to see if all hosts have failed 49116 1727204709.63113: getting the remaining hosts for this loop 49116 1727204709.63114: done getting the remaining hosts for this loop 49116 1727204709.63118: getting the next task for host managed-node3 49116 1727204709.63124: done getting next task for host managed-node3 49116 1727204709.63127: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 49116 1727204709.63130: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204709.63145: getting variables 49116 1727204709.63147: in VariableManager get_vars() 49116 1727204709.63184: Calling all_inventory to load vars for managed-node3 49116 1727204709.63186: Calling groups_inventory to load vars for managed-node3 49116 1727204709.63188: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204709.63198: Calling all_plugins_play to load vars for managed-node3 49116 1727204709.63201: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204709.63204: Calling groups_plugins_play to load vars for managed-node3 49116 1727204709.64423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204709.66453: done with get_vars() 49116 1727204709.66500: done getting variables 49116 1727204709.66578: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.853) 0:00:32.691 ***** 49116 1727204709.66622: entering _queue_task() for managed-node3/service 49116 1727204709.67032: worker is 1 (out of 1 available) 49116 1727204709.67048: exiting _queue_task() for managed-node3/service 49116 1727204709.67064: done queuing things up, now waiting for results queue to drain 49116 1727204709.67263: waiting for pending results... 
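(Aside for readers tracing this run: the task that just completed above was invoked with the module arguments recorded in the log, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, ...} under no_log, which is why the result prints only the "censored" placeholder. A minimal sketch of the corresponding task is given below; it is reconstructed from the logged arguments, not copied from roles/network/tasks/main.yml, and the task name and module spelling follow the log's resolution to ansible.legacy.systemd.)

    # Sketch only: reconstructed from the module args seen in this log.
    # The actual task in the fedora.linux_system_roles.network role may differ.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:        # logged as ansible.legacy.systemd after resolution
        name: NetworkManager          # service name, as seen in the logged args
        state: started
        enabled: true
      no_log: true                    # explains the "censored" result printed above
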
49116 1727204709.67399: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 49116 1727204709.67520: in run() - task 127b8e07-fff9-02f7-957b-000000000072 49116 1727204709.67531: variable 'ansible_search_path' from source: unknown 49116 1727204709.67535: variable 'ansible_search_path' from source: unknown 49116 1727204709.67576: calling self._execute() 49116 1727204709.67668: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204709.67673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204709.67684: variable 'omit' from source: magic vars 49116 1727204709.68001: variable 'ansible_distribution_major_version' from source: facts 49116 1727204709.68012: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204709.68104: variable 'network_provider' from source: set_fact 49116 1727204709.68108: Evaluated conditional (network_provider == "nm"): True 49116 1727204709.68185: variable '__network_wpa_supplicant_required' from source: role '' defaults 49116 1727204709.68255: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49116 1727204709.68397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204709.70552: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204709.70616: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204709.70650: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204709.70684: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204709.70709: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204709.70923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204709.70947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204709.70968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204709.70997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204709.71010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204709.71054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204709.71073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 49116 1727204709.71091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204709.71120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204709.71133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204709.71169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204709.71188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204709.71205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204709.71240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204709.71251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204709.71359: variable 'network_connections' from source: task vars 49116 1727204709.71373: variable 'interface' from source: play vars 49116 1727204709.71428: variable 'interface' from source: play vars 49116 1727204709.71440: variable 'vlan_interface' from source: play vars 49116 1727204709.71491: variable 'vlan_interface' from source: play vars 49116 1727204709.71551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204709.71684: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204709.71714: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204709.71737: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204709.71760: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204709.71802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49116 1727204709.71818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49116 1727204709.71839: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204709.71858: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49116 1727204709.71904: variable '__network_wireless_connections_defined' from source: role '' defaults 49116 1727204709.72097: variable 'network_connections' from source: task vars 49116 1727204709.72101: variable 'interface' from source: play vars 49116 1727204709.72153: variable 'interface' from source: play vars 49116 1727204709.72160: variable 'vlan_interface' from source: play vars 49116 1727204709.72207: variable 'vlan_interface' from source: play vars 49116 1727204709.72233: Evaluated conditional (__network_wpa_supplicant_required): False 49116 1727204709.72239: when evaluation is False, skipping this task 49116 1727204709.72251: _execute() done 49116 1727204709.72254: dumping result to json 49116 1727204709.72257: done dumping result, returning 49116 1727204709.72259: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-02f7-957b-000000000072] 49116 1727204709.72261: sending task result for task 127b8e07-fff9-02f7-957b-000000000072 49116 1727204709.72360: done sending task result for task 127b8e07-fff9-02f7-957b-000000000072 49116 1727204709.72363: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 49116 1727204709.72416: no more pending results, returning what we have 49116 1727204709.72419: results queue empty 49116 1727204709.72421: checking for any_errors_fatal 49116 1727204709.72448: done checking for any_errors_fatal 49116 1727204709.72449: checking for max_fail_percentage 49116 1727204709.72451: done checking for max_fail_percentage 49116 1727204709.72452: checking to see if all hosts have failed and the running result is not ok 49116 1727204709.72453: done checking to see if all hosts have failed 49116 1727204709.72454: getting the remaining hosts for this loop 49116 1727204709.72455: done getting the remaining hosts for this loop 49116 1727204709.72460: getting the next task for host managed-node3 49116 1727204709.72469: done getting next task for host managed-node3 49116 1727204709.72473: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 49116 1727204709.72477: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204709.72498: getting variables 49116 1727204709.72499: in VariableManager get_vars() 49116 1727204709.72541: Calling all_inventory to load vars for managed-node3 49116 1727204709.72543: Calling groups_inventory to load vars for managed-node3 49116 1727204709.72545: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204709.72556: Calling all_plugins_play to load vars for managed-node3 49116 1727204709.72559: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204709.72563: Calling groups_plugins_play to load vars for managed-node3 49116 1727204709.73705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204709.74912: done with get_vars() 49116 1727204709.74944: done getting variables 49116 1727204709.74999: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.084) 0:00:32.775 ***** 49116 1727204709.75027: entering _queue_task() for managed-node3/service 49116 1727204709.75324: worker is 1 (out of 1 available) 49116 1727204709.75342: exiting _queue_task() for managed-node3/service 49116 1727204709.75355: done queuing things up, now waiting for results queue to drain 49116 1727204709.75356: waiting for pending results... 49116 1727204709.75571: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 49116 1727204709.75675: in run() - task 127b8e07-fff9-02f7-957b-000000000073 49116 1727204709.75692: variable 'ansible_search_path' from source: unknown 49116 1727204709.75696: variable 'ansible_search_path' from source: unknown 49116 1727204709.75729: calling self._execute() 49116 1727204709.75818: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204709.75822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204709.75827: variable 'omit' from source: magic vars 49116 1727204709.76138: variable 'ansible_distribution_major_version' from source: facts 49116 1727204709.76148: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204709.76237: variable 'network_provider' from source: set_fact 49116 1727204709.76248: Evaluated conditional (network_provider == "initscripts"): False 49116 1727204709.76252: when evaluation is False, skipping this task 49116 1727204709.76260: _execute() done 49116 1727204709.76263: dumping result to json 49116 1727204709.76267: done dumping result, returning 49116 1727204709.76275: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-02f7-957b-000000000073] 49116 1727204709.76278: sending task result for task 127b8e07-fff9-02f7-957b-000000000073 49116 1727204709.76379: done sending task result for task 127b8e07-fff9-02f7-957b-000000000073 49116 1727204709.76382: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49116 
1727204709.76432: no more pending results, returning what we have 49116 1727204709.76436: results queue empty 49116 1727204709.76437: checking for any_errors_fatal 49116 1727204709.76449: done checking for any_errors_fatal 49116 1727204709.76450: checking for max_fail_percentage 49116 1727204709.76452: done checking for max_fail_percentage 49116 1727204709.76453: checking to see if all hosts have failed and the running result is not ok 49116 1727204709.76454: done checking to see if all hosts have failed 49116 1727204709.76455: getting the remaining hosts for this loop 49116 1727204709.76456: done getting the remaining hosts for this loop 49116 1727204709.76461: getting the next task for host managed-node3 49116 1727204709.76472: done getting next task for host managed-node3 49116 1727204709.76475: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 49116 1727204709.76479: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204709.76503: getting variables 49116 1727204709.76505: in VariableManager get_vars() 49116 1727204709.76547: Calling all_inventory to load vars for managed-node3 49116 1727204709.76550: Calling groups_inventory to load vars for managed-node3 49116 1727204709.76552: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204709.76563: Calling all_plugins_play to load vars for managed-node3 49116 1727204709.76574: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204709.76578: Calling groups_plugins_play to load vars for managed-node3 49116 1727204709.77603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204709.79748: done with get_vars() 49116 1727204709.79785: done getting variables 49116 1727204709.79857: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.048) 0:00:32.823 ***** 49116 1727204709.79898: entering _queue_task() for managed-node3/copy 49116 1727204709.80309: worker is 1 (out of 1 available) 49116 1727204709.80323: exiting _queue_task() for managed-node3/copy 49116 1727204709.80338: done queuing things up, now waiting for results queue to drain 49116 1727204709.80339: waiting for pending results... 
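(Aside: the two skips recorded above hinge on the same guard. Because network_provider was set to "nm" earlier in this run, any task gated on the initscripts provider short-circuits before a module is ever queued, and the result carries false_condition: network_provider == "initscripts". A minimal sketch of that guard pattern follows; the copy action matches the ActionModule loaded in the log, but the file path and content are illustrative assumptions, not the role's actual source.)

    # Sketch of the conditional pattern behind the skip results above.
    # dest/content are illustrative; only the module and the when: guard are taken from the log.
    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network
        content: "# Created by ansible\n"
        mode: "0644"
      when: network_provider == "initscripts"   # evaluated False on this run, so the task is skipped
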
49116 1727204709.80794: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 49116 1727204709.80873: in run() - task 127b8e07-fff9-02f7-957b-000000000074 49116 1727204709.80900: variable 'ansible_search_path' from source: unknown 49116 1727204709.80911: variable 'ansible_search_path' from source: unknown 49116 1727204709.80960: calling self._execute() 49116 1727204709.81087: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204709.81105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204709.81171: variable 'omit' from source: magic vars 49116 1727204709.81568: variable 'ansible_distribution_major_version' from source: facts 49116 1727204709.81589: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204709.81729: variable 'network_provider' from source: set_fact 49116 1727204709.81742: Evaluated conditional (network_provider == "initscripts"): False 49116 1727204709.81757: when evaluation is False, skipping this task 49116 1727204709.81771: _execute() done 49116 1727204709.81876: dumping result to json 49116 1727204709.81880: done dumping result, returning 49116 1727204709.81884: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-02f7-957b-000000000074] 49116 1727204709.81887: sending task result for task 127b8e07-fff9-02f7-957b-000000000074 49116 1727204709.82084: done sending task result for task 127b8e07-fff9-02f7-957b-000000000074 49116 1727204709.82088: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 49116 1727204709.82147: no more pending results, returning what we have 49116 1727204709.82152: results queue empty 49116 1727204709.82153: checking for any_errors_fatal 49116 1727204709.82173: done checking for any_errors_fatal 49116 1727204709.82174: checking for max_fail_percentage 49116 1727204709.82177: done checking for max_fail_percentage 49116 1727204709.82178: checking to see if all hosts have failed and the running result is not ok 49116 1727204709.82179: done checking to see if all hosts have failed 49116 1727204709.82180: getting the remaining hosts for this loop 49116 1727204709.82181: done getting the remaining hosts for this loop 49116 1727204709.82187: getting the next task for host managed-node3 49116 1727204709.82195: done getting next task for host managed-node3 49116 1727204709.82200: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 49116 1727204709.82205: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204709.82232: getting variables 49116 1727204709.82234: in VariableManager get_vars() 49116 1727204709.82386: Calling all_inventory to load vars for managed-node3 49116 1727204709.82390: Calling groups_inventory to load vars for managed-node3 49116 1727204709.82393: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204709.82406: Calling all_plugins_play to load vars for managed-node3 49116 1727204709.82409: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204709.82413: Calling groups_plugins_play to load vars for managed-node3 49116 1727204709.83712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204709.84927: done with get_vars() 49116 1727204709.84959: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:05:09 -0400 (0:00:00.051) 0:00:32.875 ***** 49116 1727204709.85039: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 49116 1727204709.85336: worker is 1 (out of 1 available) 49116 1727204709.85352: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 49116 1727204709.85369: done queuing things up, now waiting for results queue to drain 49116 1727204709.85371: waiting for pending results... 49116 1727204709.85585: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 49116 1727204709.85691: in run() - task 127b8e07-fff9-02f7-957b-000000000075 49116 1727204709.85711: variable 'ansible_search_path' from source: unknown 49116 1727204709.85715: variable 'ansible_search_path' from source: unknown 49116 1727204709.85746: calling self._execute() 49116 1727204709.85836: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204709.85844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204709.85853: variable 'omit' from source: magic vars 49116 1727204709.86170: variable 'ansible_distribution_major_version' from source: facts 49116 1727204709.86180: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204709.86186: variable 'omit' from source: magic vars 49116 1727204709.86238: variable 'omit' from source: magic vars 49116 1727204709.86379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204709.88049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204709.88106: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204709.88130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204709.88160: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204709.88181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204709.88254: variable 'network_provider' from source: set_fact 49116 1727204709.88371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 49116 1727204709.88707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204709.88727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204709.88764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204709.88778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204709.88847: variable 'omit' from source: magic vars 49116 1727204709.88942: variable 'omit' from source: magic vars 49116 1727204709.89023: variable 'network_connections' from source: task vars 49116 1727204709.89033: variable 'interface' from source: play vars 49116 1727204709.89085: variable 'interface' from source: play vars 49116 1727204709.89092: variable 'vlan_interface' from source: play vars 49116 1727204709.89141: variable 'vlan_interface' from source: play vars 49116 1727204709.89259: variable 'omit' from source: magic vars 49116 1727204709.89267: variable '__lsr_ansible_managed' from source: task vars 49116 1727204709.89315: variable '__lsr_ansible_managed' from source: task vars 49116 1727204709.89533: Loaded config def from plugin (lookup/template) 49116 1727204709.89540: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 49116 1727204709.89563: File lookup term: get_ansible_managed.j2 49116 1727204709.89569: variable 'ansible_search_path' from source: unknown 49116 1727204709.89572: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 49116 1727204709.89584: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 49116 1727204709.89600: variable 'ansible_search_path' from source: unknown 49116 1727204709.99199: variable 'ansible_managed' from source: unknown 49116 1727204709.99311: variable 'omit' from source: magic vars 49116 1727204709.99332: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204709.99353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204709.99367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204709.99380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204709.99391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204709.99409: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204709.99413: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204709.99416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204709.99480: Set connection var ansible_connection to ssh 49116 1727204709.99491: Set connection var ansible_timeout to 10 49116 1727204709.99499: Set connection var ansible_shell_executable to /bin/sh 49116 1727204709.99506: Set connection var ansible_pipelining to False 49116 1727204709.99509: Set connection var ansible_shell_type to sh 49116 1727204709.99517: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204709.99536: variable 'ansible_shell_executable' from source: unknown 49116 1727204709.99539: variable 'ansible_connection' from source: unknown 49116 1727204709.99542: variable 'ansible_module_compression' from source: unknown 49116 1727204709.99544: variable 'ansible_shell_type' from source: unknown 49116 1727204709.99547: variable 'ansible_shell_executable' from source: unknown 49116 1727204709.99550: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204709.99552: variable 'ansible_pipelining' from source: unknown 49116 1727204709.99554: variable 'ansible_timeout' from source: unknown 49116 1727204709.99560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204709.99664: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204709.99677: variable 'omit' from source: magic vars 49116 1727204709.99680: starting attempt loop 49116 1727204709.99683: running the handler 49116 1727204709.99693: _low_level_execute_command(): starting 49116 1727204709.99698: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204710.00251: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204710.00260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204710.00263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 
1727204710.00268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204710.00320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204710.00323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204710.00326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204710.00410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204710.02280: stdout chunk (state=3): >>>/root <<< 49116 1727204710.02383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204710.02453: stderr chunk (state=3): >>><<< 49116 1727204710.02457: stdout chunk (state=3): >>><<< 49116 1727204710.02479: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204710.02490: _low_level_execute_command(): starting 49116 1727204710.02497: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792 `" && echo ansible-tmp-1727204710.0247984-50932-148760715832792="` echo /root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792 `" ) && sleep 0' 49116 1727204710.03047: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204710.03051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204710.03054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204710.03056: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204710.03122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204710.03125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204710.03126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204710.03191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204710.05368: stdout chunk (state=3): >>>ansible-tmp-1727204710.0247984-50932-148760715832792=/root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792 <<< 49116 1727204710.05586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204710.05589: stdout chunk (state=3): >>><<< 49116 1727204710.05592: stderr chunk (state=3): >>><<< 49116 1727204710.05647: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204710.0247984-50932-148760715832792=/root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204710.05673: variable 'ansible_module_compression' from source: unknown 49116 1727204710.05728: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 49116 1727204710.05771: variable 'ansible_facts' from source: unknown 49116 1727204710.05865: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/AnsiballZ_network_connections.py 49116 1727204710.05983: Sending initial data 49116 1727204710.05987: Sent initial data (168 bytes) 49116 1727204710.06500: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204710.06505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204710.06512: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204710.06514: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204710.06564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204710.06570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204710.06573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204710.06651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204710.08478: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204710.08552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204710.08629: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpigb_jci1 /root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/AnsiballZ_network_connections.py <<< 49116 1727204710.08633: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/AnsiballZ_network_connections.py" <<< 49116 1727204710.08705: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpigb_jci1" to remote "/root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/AnsiballZ_network_connections.py" <<< 49116 1727204710.09931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204710.10082: stderr chunk (state=3): >>><<< 49116 1727204710.10086: stdout chunk (state=3): >>><<< 49116 1727204710.10088: done transferring module to remote 49116 1727204710.10090: _low_level_execute_command(): starting 49116 1727204710.10093: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/ /root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/AnsiballZ_network_connections.py && sleep 0' 49116 1727204710.10693: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204710.10817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204710.10845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204710.10956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204710.13135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204710.13140: stdout chunk (state=3): >>><<< 49116 1727204710.13143: stderr chunk (state=3): >>><<< 49116 1727204710.13262: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204710.13271: _low_level_execute_command(): starting 49116 1727204710.13274: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/AnsiballZ_network_connections.py && sleep 0' 49116 1727204710.13877: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204710.13896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204710.13972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204710.13987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204710.14033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204710.14053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204710.14081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204710.14203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204710.59218: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ryuomqqr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ryuomqqr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/f99c432c-37f8-451d-93e7-dceda575981b: error=unknown <<< 49116 1727204710.61703: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ryuomqqr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ryuomqqr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/04325438-476c-4a74-b269-34ef9c78263d: error=unknown <<< 49116 1727204710.61954: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 49116 1727204710.64098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204710.64169: stderr chunk (state=3): >>><<< 49116 1727204710.64173: stdout chunk (state=3): >>><<< 49116 1727204710.64189: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ryuomqqr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ryuomqqr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/f99c432c-37f8-451d-93e7-dceda575981b: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ryuomqqr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ryuomqqr/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/04325438-476c-4a74-b269-34ef9c78263d: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": 
"nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204710.64230: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'lsr101.90', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204710.64238: _low_level_execute_command(): starting 49116 1727204710.64244: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204710.0247984-50932-148760715832792/ > /dev/null 2>&1 && sleep 0' 49116 1727204710.64755: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204710.64759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204710.64764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 49116 1727204710.64767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204710.64769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204710.64825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204710.64836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204710.64839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204710.64910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204710.67017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204710.67083: stderr chunk (state=3): >>><<< 49116 1727204710.67087: stdout chunk (state=3): >>><<< 49116 1727204710.67100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204710.67107: handler run complete 49116 1727204710.67128: attempt loop complete, returning result 49116 1727204710.67131: _execute() done 49116 1727204710.67134: dumping result to json 49116 1727204710.67141: done dumping result, returning 49116 1727204710.67153: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-02f7-957b-000000000075] 49116 1727204710.67156: sending task result for task 127b8e07-fff9-02f7-957b-000000000075 49116 1727204710.67275: done sending task result for task 127b8e07-fff9-02f7-957b-000000000075 49116 1727204710.67278: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 49116 1727204710.67396: no more pending results, returning what we have 49116 1727204710.67401: results queue empty 49116 1727204710.67402: checking for any_errors_fatal 49116 1727204710.67410: done checking for any_errors_fatal 49116 1727204710.67411: checking for max_fail_percentage 49116 1727204710.67413: done checking for max_fail_percentage 49116 1727204710.67414: checking to see if all hosts have failed and the running result is not ok 49116 1727204710.67415: done checking to see if all hosts have failed 49116 1727204710.67415: getting the remaining hosts for this loop 49116 1727204710.67417: done getting the remaining hosts for this loop 49116 1727204710.67420: getting the next task for host managed-node3 49116 1727204710.67426: done getting next task for host managed-node3 49116 1727204710.67430: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 49116 1727204710.67433: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204710.67444: getting variables 49116 1727204710.67446: in VariableManager get_vars() 49116 1727204710.67494: Calling all_inventory to load vars for managed-node3 49116 1727204710.67497: Calling groups_inventory to load vars for managed-node3 49116 1727204710.67499: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204710.67508: Calling all_plugins_play to load vars for managed-node3 49116 1727204710.67511: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204710.67513: Calling groups_plugins_play to load vars for managed-node3 49116 1727204710.68780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204710.69970: done with get_vars() 49116 1727204710.70003: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.850) 0:00:33.725 ***** 49116 1727204710.70082: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 49116 1727204710.70383: worker is 1 (out of 1 available) 49116 1727204710.70399: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 49116 1727204710.70414: done queuing things up, now waiting for results queue to drain 49116 1727204710.70416: waiting for pending results... 49116 1727204710.70629: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 49116 1727204710.70736: in run() - task 127b8e07-fff9-02f7-957b-000000000076 49116 1727204710.70752: variable 'ansible_search_path' from source: unknown 49116 1727204710.70759: variable 'ansible_search_path' from source: unknown 49116 1727204710.70801: calling self._execute() 49116 1727204710.70894: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.70898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.70908: variable 'omit' from source: magic vars 49116 1727204710.71227: variable 'ansible_distribution_major_version' from source: facts 49116 1727204710.71249: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204710.71338: variable 'network_state' from source: role '' defaults 49116 1727204710.71347: Evaluated conditional (network_state != {}): False 49116 1727204710.71350: when evaluation is False, skipping this task 49116 1727204710.71353: _execute() done 49116 1727204710.71356: dumping result to json 49116 1727204710.71359: done dumping result, returning 49116 1727204710.71369: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-02f7-957b-000000000076] 49116 1727204710.71373: sending task result for task 127b8e07-fff9-02f7-957b-000000000076 49116 1727204710.71472: done sending task result for task 127b8e07-fff9-02f7-957b-000000000076 49116 1727204710.71475: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49116 1727204710.71530: no more pending results, returning what we have 49116 1727204710.71536: results queue empty 49116 1727204710.71538: checking for any_errors_fatal 49116 1727204710.71549: done checking for any_errors_fatal 49116 1727204710.71549: checking for max_fail_percentage 49116 
1727204710.71552: done checking for max_fail_percentage 49116 1727204710.71553: checking to see if all hosts have failed and the running result is not ok 49116 1727204710.71554: done checking to see if all hosts have failed 49116 1727204710.71554: getting the remaining hosts for this loop 49116 1727204710.71556: done getting the remaining hosts for this loop 49116 1727204710.71562: getting the next task for host managed-node3 49116 1727204710.71571: done getting next task for host managed-node3 49116 1727204710.71575: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 49116 1727204710.71578: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204710.71599: getting variables 49116 1727204710.71600: in VariableManager get_vars() 49116 1727204710.71644: Calling all_inventory to load vars for managed-node3 49116 1727204710.71647: Calling groups_inventory to load vars for managed-node3 49116 1727204710.71649: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204710.71659: Calling all_plugins_play to load vars for managed-node3 49116 1727204710.71662: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204710.71664: Calling groups_plugins_play to load vars for managed-node3 49116 1727204710.72710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204710.74032: done with get_vars() 49116 1727204710.74055: done getting variables 49116 1727204710.74108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.040) 0:00:33.766 ***** 49116 1727204710.74140: entering _queue_task() for managed-node3/debug 49116 1727204710.74442: worker is 1 (out of 1 available) 49116 1727204710.74457: exiting _queue_task() for managed-node3/debug 49116 1727204710.74471: done queuing things up, now waiting for results queue to drain 49116 1727204710.74473: waiting for pending results... 
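The task queued next only echoes the stderr captured from the earlier network_connections run. A debug task of the shape sketched here would produce the ok result shown further below; the task name, the debug action, and the __network_connections_result variable are confirmed by the log, while the exact wording at tasks/main.yml:177 is an assumption:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines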
49116 1727204710.74677: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 49116 1727204710.74783: in run() - task 127b8e07-fff9-02f7-957b-000000000077 49116 1727204710.74797: variable 'ansible_search_path' from source: unknown 49116 1727204710.74803: variable 'ansible_search_path' from source: unknown 49116 1727204710.74839: calling self._execute() 49116 1727204710.74922: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.74928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.74939: variable 'omit' from source: magic vars 49116 1727204710.75240: variable 'ansible_distribution_major_version' from source: facts 49116 1727204710.75250: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204710.75261: variable 'omit' from source: magic vars 49116 1727204710.75311: variable 'omit' from source: magic vars 49116 1727204710.75340: variable 'omit' from source: magic vars 49116 1727204710.75383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204710.75415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204710.75432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204710.75451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204710.75462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204710.75491: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204710.75494: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.75497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.75572: Set connection var ansible_connection to ssh 49116 1727204710.75587: Set connection var ansible_timeout to 10 49116 1727204710.75594: Set connection var ansible_shell_executable to /bin/sh 49116 1727204710.75599: Set connection var ansible_pipelining to False 49116 1727204710.75602: Set connection var ansible_shell_type to sh 49116 1727204710.75607: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204710.75627: variable 'ansible_shell_executable' from source: unknown 49116 1727204710.75630: variable 'ansible_connection' from source: unknown 49116 1727204710.75633: variable 'ansible_module_compression' from source: unknown 49116 1727204710.75635: variable 'ansible_shell_type' from source: unknown 49116 1727204710.75640: variable 'ansible_shell_executable' from source: unknown 49116 1727204710.75643: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.75648: variable 'ansible_pipelining' from source: unknown 49116 1727204710.75651: variable 'ansible_timeout' from source: unknown 49116 1727204710.75655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.75778: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 
1727204710.75788: variable 'omit' from source: magic vars 49116 1727204710.75796: starting attempt loop 49116 1727204710.75799: running the handler 49116 1727204710.75911: variable '__network_connections_result' from source: set_fact 49116 1727204710.75960: handler run complete 49116 1727204710.75977: attempt loop complete, returning result 49116 1727204710.75980: _execute() done 49116 1727204710.75983: dumping result to json 49116 1727204710.75986: done dumping result, returning 49116 1727204710.75994: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-02f7-957b-000000000077] 49116 1727204710.75999: sending task result for task 127b8e07-fff9-02f7-957b-000000000077 49116 1727204710.76092: done sending task result for task 127b8e07-fff9-02f7-957b-000000000077 49116 1727204710.76095: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 49116 1727204710.76170: no more pending results, returning what we have 49116 1727204710.76174: results queue empty 49116 1727204710.76175: checking for any_errors_fatal 49116 1727204710.76184: done checking for any_errors_fatal 49116 1727204710.76184: checking for max_fail_percentage 49116 1727204710.76186: done checking for max_fail_percentage 49116 1727204710.76187: checking to see if all hosts have failed and the running result is not ok 49116 1727204710.76188: done checking to see if all hosts have failed 49116 1727204710.76189: getting the remaining hosts for this loop 49116 1727204710.76190: done getting the remaining hosts for this loop 49116 1727204710.76195: getting the next task for host managed-node3 49116 1727204710.76201: done getting next task for host managed-node3 49116 1727204710.76212: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 49116 1727204710.76215: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204710.76227: getting variables 49116 1727204710.76229: in VariableManager get_vars() 49116 1727204710.76272: Calling all_inventory to load vars for managed-node3 49116 1727204710.76275: Calling groups_inventory to load vars for managed-node3 49116 1727204710.76277: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204710.76287: Calling all_plugins_play to load vars for managed-node3 49116 1727204710.76290: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204710.76292: Calling groups_plugins_play to load vars for managed-node3 49116 1727204710.81945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204710.83309: done with get_vars() 49116 1727204710.83349: done getting variables 49116 1727204710.83410: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.093) 0:00:33.859 ***** 49116 1727204710.83444: entering _queue_task() for managed-node3/debug 49116 1727204710.83859: worker is 1 (out of 1 available) 49116 1727204710.83878: exiting _queue_task() for managed-node3/debug 49116 1727204710.83895: done queuing things up, now waiting for results queue to drain 49116 1727204710.83897: waiting for pending results... 49116 1727204710.84398: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 49116 1727204710.84597: in run() - task 127b8e07-fff9-02f7-957b-000000000078 49116 1727204710.84602: variable 'ansible_search_path' from source: unknown 49116 1727204710.84605: variable 'ansible_search_path' from source: unknown 49116 1727204710.84608: calling self._execute() 49116 1727204710.84739: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.84745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.84748: variable 'omit' from source: magic vars 49116 1727204710.85162: variable 'ansible_distribution_major_version' from source: facts 49116 1727204710.85192: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204710.85205: variable 'omit' from source: magic vars 49116 1727204710.85284: variable 'omit' from source: magic vars 49116 1727204710.85339: variable 'omit' from source: magic vars 49116 1727204710.85405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204710.85454: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204710.85501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204710.85521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204710.85611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204710.85617: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204710.85621: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.85623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.85735: Set connection var ansible_connection to ssh 49116 1727204710.85762: Set connection var ansible_timeout to 10 49116 1727204710.85829: Set connection var ansible_shell_executable to /bin/sh 49116 1727204710.85833: Set connection var ansible_pipelining to False 49116 1727204710.85835: Set connection var ansible_shell_type to sh 49116 1727204710.85837: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204710.85842: variable 'ansible_shell_executable' from source: unknown 49116 1727204710.85857: variable 'ansible_connection' from source: unknown 49116 1727204710.85869: variable 'ansible_module_compression' from source: unknown 49116 1727204710.85879: variable 'ansible_shell_type' from source: unknown 49116 1727204710.85887: variable 'ansible_shell_executable' from source: unknown 49116 1727204710.85896: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.85905: variable 'ansible_pipelining' from source: unknown 49116 1727204710.85937: variable 'ansible_timeout' from source: unknown 49116 1727204710.85941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.86123: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204710.86156: variable 'omit' from source: magic vars 49116 1727204710.86188: starting attempt loop 49116 1727204710.86192: running the handler 49116 1727204710.86248: variable '__network_connections_result' from source: set_fact 49116 1727204710.86374: variable '__network_connections_result' from source: set_fact 49116 1727204710.86515: handler run complete 49116 1727204710.86625: attempt loop complete, returning result 49116 1727204710.86629: _execute() done 49116 1727204710.86631: dumping result to json 49116 1727204710.86634: done dumping result, returning 49116 1727204710.86636: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-02f7-957b-000000000078] 49116 1727204710.86639: sending task result for task 127b8e07-fff9-02f7-957b-000000000078 49116 1727204710.86725: done sending task result for task 127b8e07-fff9-02f7-957b-000000000078 49116 1727204710.86731: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 49116 1727204710.86839: no more pending results, returning what we have 49116 1727204710.86845: results queue empty 49116 1727204710.86846: checking for any_errors_fatal 49116 1727204710.86859: done checking for any_errors_fatal 49116 1727204710.86860: checking for max_fail_percentage 49116 
1727204710.86862: done checking for max_fail_percentage 49116 1727204710.86863: checking to see if all hosts have failed and the running result is not ok 49116 1727204710.86864: done checking to see if all hosts have failed 49116 1727204710.86866: getting the remaining hosts for this loop 49116 1727204710.86868: done getting the remaining hosts for this loop 49116 1727204710.86873: getting the next task for host managed-node3 49116 1727204710.86880: done getting next task for host managed-node3 49116 1727204710.86885: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 49116 1727204710.86889: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204710.86905: getting variables 49116 1727204710.86907: in VariableManager get_vars() 49116 1727204710.86953: Calling all_inventory to load vars for managed-node3 49116 1727204710.86957: Calling groups_inventory to load vars for managed-node3 49116 1727204710.86959: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204710.87190: Calling all_plugins_play to load vars for managed-node3 49116 1727204710.87195: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204710.87200: Calling groups_plugins_play to load vars for managed-node3 49116 1727204710.88798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204710.90030: done with get_vars() 49116 1727204710.90057: done getting variables 49116 1727204710.90110: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.066) 0:00:33.926 ***** 49116 1727204710.90144: entering _queue_task() for managed-node3/debug 49116 1727204710.90431: worker is 1 (out of 1 available) 49116 1727204710.90446: exiting _queue_task() for managed-node3/debug 49116 1727204710.90461: done queuing things up, now waiting for results queue to drain 49116 1727204710.90462: waiting for pending results... 
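The debug task traced above dumps the set_fact variable __network_connections_result, and its logged module_args show exactly what the role was asked to do: take down and remove the two profiles lsr101 and lsr101.90 (persistent_state: absent, state: down), using the "nm" provider. As a reading aid, a minimal YAML sketch of the kind of play input that would produce this invocation follows. It is an assumption for illustration only: the task name and the include_role form are invented here, since the log exposes the resulting module arguments but not the playbook source.

    - name: Remove the test VLAN profiles      # task name is illustrative, not from the log
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:
          # values below are copied from the module_args shown in the task result above
          - name: lsr101
            persistent_state: absent
            state: down
          - name: lsr101.90
            persistent_state: absent
            state: down

The result reported "changed": true with empty stderr, so both profiles were removed cleanly before the role moves on to the network_state debug task and the connectivity re-test traced next.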
49116 1727204710.90689: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 49116 1727204710.90790: in run() - task 127b8e07-fff9-02f7-957b-000000000079 49116 1727204710.90805: variable 'ansible_search_path' from source: unknown 49116 1727204710.90809: variable 'ansible_search_path' from source: unknown 49116 1727204710.90846: calling self._execute() 49116 1727204710.90940: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.90946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.90955: variable 'omit' from source: magic vars 49116 1727204710.91281: variable 'ansible_distribution_major_version' from source: facts 49116 1727204710.91290: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204710.91387: variable 'network_state' from source: role '' defaults 49116 1727204710.91396: Evaluated conditional (network_state != {}): False 49116 1727204710.91400: when evaluation is False, skipping this task 49116 1727204710.91402: _execute() done 49116 1727204710.91405: dumping result to json 49116 1727204710.91409: done dumping result, returning 49116 1727204710.91417: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-02f7-957b-000000000079] 49116 1727204710.91422: sending task result for task 127b8e07-fff9-02f7-957b-000000000079 49116 1727204710.91523: done sending task result for task 127b8e07-fff9-02f7-957b-000000000079 49116 1727204710.91526: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 49116 1727204710.91577: no more pending results, returning what we have 49116 1727204710.91581: results queue empty 49116 1727204710.91583: checking for any_errors_fatal 49116 1727204710.91591: done checking for any_errors_fatal 49116 1727204710.91591: checking for max_fail_percentage 49116 1727204710.91593: done checking for max_fail_percentage 49116 1727204710.91595: checking to see if all hosts have failed and the running result is not ok 49116 1727204710.91595: done checking to see if all hosts have failed 49116 1727204710.91596: getting the remaining hosts for this loop 49116 1727204710.91598: done getting the remaining hosts for this loop 49116 1727204710.91602: getting the next task for host managed-node3 49116 1727204710.91610: done getting next task for host managed-node3 49116 1727204710.91616: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 49116 1727204710.91619: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204710.91643: getting variables 49116 1727204710.91644: in VariableManager get_vars() 49116 1727204710.91690: Calling all_inventory to load vars for managed-node3 49116 1727204710.91693: Calling groups_inventory to load vars for managed-node3 49116 1727204710.91695: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204710.91707: Calling all_plugins_play to load vars for managed-node3 49116 1727204710.91709: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204710.91712: Calling groups_plugins_play to load vars for managed-node3 49116 1727204710.92734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204710.94392: done with get_vars() 49116 1727204710.94426: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:05:10 -0400 (0:00:00.043) 0:00:33.970 ***** 49116 1727204710.94538: entering _queue_task() for managed-node3/ping 49116 1727204710.94928: worker is 1 (out of 1 available) 49116 1727204710.94940: exiting _queue_task() for managed-node3/ping 49116 1727204710.94955: done queuing things up, now waiting for results queue to drain 49116 1727204710.94956: waiting for pending results... 49116 1727204710.95201: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 49116 1727204710.95322: in run() - task 127b8e07-fff9-02f7-957b-00000000007a 49116 1727204710.95334: variable 'ansible_search_path' from source: unknown 49116 1727204710.95338: variable 'ansible_search_path' from source: unknown 49116 1727204710.95374: calling self._execute() 49116 1727204710.95468: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.95472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.95481: variable 'omit' from source: magic vars 49116 1727204710.95806: variable 'ansible_distribution_major_version' from source: facts 49116 1727204710.95816: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204710.95829: variable 'omit' from source: magic vars 49116 1727204710.95876: variable 'omit' from source: magic vars 49116 1727204710.95903: variable 'omit' from source: magic vars 49116 1727204710.95944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204710.95977: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204710.95995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204710.96010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204710.96021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204710.96051: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204710.96056: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.96058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.96131: Set connection var ansible_connection to ssh 49116 1727204710.96144: Set connection var 
ansible_timeout to 10 49116 1727204710.96156: Set connection var ansible_shell_executable to /bin/sh 49116 1727204710.96159: Set connection var ansible_pipelining to False 49116 1727204710.96162: Set connection var ansible_shell_type to sh 49116 1727204710.96164: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204710.96187: variable 'ansible_shell_executable' from source: unknown 49116 1727204710.96190: variable 'ansible_connection' from source: unknown 49116 1727204710.96193: variable 'ansible_module_compression' from source: unknown 49116 1727204710.96196: variable 'ansible_shell_type' from source: unknown 49116 1727204710.96199: variable 'ansible_shell_executable' from source: unknown 49116 1727204710.96201: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204710.96203: variable 'ansible_pipelining' from source: unknown 49116 1727204710.96206: variable 'ansible_timeout' from source: unknown 49116 1727204710.96211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204710.96379: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49116 1727204710.96392: variable 'omit' from source: magic vars 49116 1727204710.96395: starting attempt loop 49116 1727204710.96398: running the handler 49116 1727204710.96411: _low_level_execute_command(): starting 49116 1727204710.96419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204710.97087: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204710.97114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204710.97224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204710.99077: stdout chunk (state=3): >>>/root <<< 49116 1727204710.99268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204710.99290: stderr chunk (state=3): >>><<< 49116 1727204710.99299: stdout chunk (state=3): >>><<< 49116 1727204710.99330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204710.99352: _low_level_execute_command(): starting 49116 1727204710.99364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464 `" && echo ansible-tmp-1727204710.9933805-50963-174989767182464="` echo /root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464 `" ) && sleep 0' 49116 1727204711.00072: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204711.00088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204711.00152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204711.00230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204711.00276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204711.00280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204711.00393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204711.02596: stdout chunk (state=3): >>>ansible-tmp-1727204710.9933805-50963-174989767182464=/root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464 <<< 49116 1727204711.02818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204711.02822: stdout chunk (state=3): >>><<< 49116 1727204711.02825: stderr chunk (state=3): >>><<< 49116 1727204711.02847: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204710.9933805-50963-174989767182464=/root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204711.02972: variable 'ansible_module_compression' from source: unknown 49116 1727204711.02975: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 49116 1727204711.03017: variable 'ansible_facts' from source: unknown 49116 1727204711.03110: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/AnsiballZ_ping.py 49116 1727204711.03264: Sending initial data 49116 1727204711.03393: Sent initial data (153 bytes) 49116 1727204711.03988: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204711.04020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 49116 1727204711.04086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204711.04140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204711.04156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204711.04185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204711.04293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204711.06133: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204711.06193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204711.06271: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpnxbto5ts /root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/AnsiballZ_ping.py <<< 49116 1727204711.06276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/AnsiballZ_ping.py" <<< 49116 1727204711.06376: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpnxbto5ts" to remote "/root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/AnsiballZ_ping.py" <<< 49116 1727204711.07318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204711.07370: stderr chunk (state=3): >>><<< 49116 1727204711.07381: stdout chunk (state=3): >>><<< 49116 1727204711.07416: done transferring module to remote 49116 1727204711.07472: _low_level_execute_command(): starting 49116 1727204711.07476: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/ /root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/AnsiballZ_ping.py && sleep 0' 49116 1727204711.08209: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204711.08237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204711.08294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204711.08298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204711.08404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204711.08407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204711.08450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204711.08654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204711.10614: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 49116 1727204711.10751: stderr chunk (state=3): >>><<< 49116 1727204711.10755: stdout chunk (state=3): >>><<< 49116 1727204711.10796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204711.10799: _low_level_execute_command(): starting 49116 1727204711.10802: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/AnsiballZ_ping.py && sleep 0' 49116 1727204711.11586: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204711.11625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204711.11739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204711.29185: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 49116 1727204711.30747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204711.30811: stderr chunk (state=3): >>><<< 49116 1727204711.30816: stdout chunk (state=3): >>><<< 49116 1727204711.30835: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 49116 1727204711.30855: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204711.30864: _low_level_execute_command(): starting 49116 1727204711.30871: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204710.9933805-50963-174989767182464/ > /dev/null 2>&1 && sleep 0' 49116 1727204711.31597: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204711.31618: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK <<< 49116 1727204711.31636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204711.31745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204711.33825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204711.33887: stderr chunk (state=3): >>><<< 49116 1727204711.33893: stdout chunk (state=3): >>><<< 49116 1727204711.33911: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204711.33919: handler run complete 49116 1727204711.33931: attempt loop complete, returning result 49116 1727204711.33934: _execute() done 49116 1727204711.33939: dumping result to json 49116 1727204711.33941: done dumping result, returning 49116 1727204711.33951: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-02f7-957b-00000000007a] 49116 1727204711.33957: sending task result for task 127b8e07-fff9-02f7-957b-00000000007a 49116 1727204711.34056: done sending task result for task 127b8e07-fff9-02f7-957b-00000000007a 49116 1727204711.34059: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 49116 1727204711.34129: no more pending results, returning what we have 49116 1727204711.34133: results queue empty 49116 1727204711.34134: checking for any_errors_fatal 49116 1727204711.34141: done checking for any_errors_fatal 49116 1727204711.34141: checking for max_fail_percentage 49116 1727204711.34143: done checking for max_fail_percentage 49116 1727204711.34144: checking to see if all hosts have failed and the running result is not ok 49116 1727204711.34145: done checking to see if all hosts have failed 49116 1727204711.34146: getting the remaining hosts for this loop 49116 1727204711.34147: done getting the remaining hosts for this loop 49116 1727204711.34151: getting the next task for host managed-node3 49116 1727204711.34163: done getting next task for host managed-node3 49116 1727204711.34167: ^ task is: TASK: meta (role_complete) 49116 1727204711.34170: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204711.34182: getting variables 49116 1727204711.34184: in VariableManager get_vars() 49116 1727204711.34228: Calling all_inventory to load vars for managed-node3 49116 1727204711.34231: Calling groups_inventory to load vars for managed-node3 49116 1727204711.34233: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.34245: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.34247: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.34250: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.35844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204711.37115: done with get_vars() 49116 1727204711.37143: done getting variables 49116 1727204711.37217: done queuing things up, now waiting for results queue to drain 49116 1727204711.37219: results queue empty 49116 1727204711.37219: checking for any_errors_fatal 49116 1727204711.37222: done checking for any_errors_fatal 49116 1727204711.37222: checking for max_fail_percentage 49116 1727204711.37223: done checking for max_fail_percentage 49116 1727204711.37223: checking to see if all hosts have failed and the running result is not ok 49116 1727204711.37224: done checking to see if all hosts have failed 49116 1727204711.37224: getting the remaining hosts for this loop 49116 1727204711.37225: done getting the remaining hosts for this loop 49116 1727204711.37227: getting the next task for host managed-node3 49116 1727204711.37230: done getting next task for host managed-node3 49116 1727204711.37233: ^ task is: TASK: Include the task 'manage_test_interface.yml' 49116 1727204711.37234: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204711.37236: getting variables 49116 1727204711.37237: in VariableManager get_vars() 49116 1727204711.37250: Calling all_inventory to load vars for managed-node3 49116 1727204711.37251: Calling groups_inventory to load vars for managed-node3 49116 1727204711.37253: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.37257: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.37258: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.37260: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.38256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204711.39462: done with get_vars() 49116 1727204711.39492: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:73 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.450) 0:00:34.420 ***** 49116 1727204711.39555: entering _queue_task() for managed-node3/include_tasks 49116 1727204711.39908: worker is 1 (out of 1 available) 49116 1727204711.39925: exiting _queue_task() for managed-node3/include_tasks 49116 1727204711.39940: done queuing things up, now waiting for results queue to drain 49116 1727204711.39942: waiting for pending results... 49116 1727204711.40156: running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' 49116 1727204711.40237: in run() - task 127b8e07-fff9-02f7-957b-0000000000aa 49116 1727204711.40251: variable 'ansible_search_path' from source: unknown 49116 1727204711.40372: calling self._execute() 49116 1727204711.40385: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204711.40395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204711.40406: variable 'omit' from source: magic vars 49116 1727204711.40733: variable 'ansible_distribution_major_version' from source: facts 49116 1727204711.40744: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204711.40751: _execute() done 49116 1727204711.40754: dumping result to json 49116 1727204711.40756: done dumping result, returning 49116 1727204711.40764: done running TaskExecutor() for managed-node3/TASK: Include the task 'manage_test_interface.yml' [127b8e07-fff9-02f7-957b-0000000000aa] 49116 1727204711.40770: sending task result for task 127b8e07-fff9-02f7-957b-0000000000aa 49116 1727204711.40916: no more pending results, returning what we have 49116 1727204711.40922: in VariableManager get_vars() 49116 1727204711.40975: Calling all_inventory to load vars for managed-node3 49116 1727204711.40980: Calling groups_inventory to load vars for managed-node3 49116 1727204711.40982: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.40998: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.41001: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.41004: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.41586: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000aa 49116 1727204711.41590: WORKER PROCESS EXITING 49116 1727204711.42088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204711.43414: done with get_vars() 49116 
1727204711.43434: variable 'ansible_search_path' from source: unknown 49116 1727204711.43448: we have included files to process 49116 1727204711.43449: generating all_blocks data 49116 1727204711.43450: done generating all_blocks data 49116 1727204711.43454: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49116 1727204711.43455: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49116 1727204711.43457: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49116 1727204711.43754: in VariableManager get_vars() 49116 1727204711.43775: done with get_vars() 49116 1727204711.44253: done processing included file 49116 1727204711.44255: iterating over new_blocks loaded from include file 49116 1727204711.44256: in VariableManager get_vars() 49116 1727204711.44274: done with get_vars() 49116 1727204711.44275: filtering new block on tags 49116 1727204711.44298: done filtering new block on tags 49116 1727204711.44300: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node3 49116 1727204711.44304: extending task lists for all hosts with included blocks 49116 1727204711.46271: done extending task lists 49116 1727204711.46273: done processing included files 49116 1727204711.46274: results queue empty 49116 1727204711.46274: checking for any_errors_fatal 49116 1727204711.46276: done checking for any_errors_fatal 49116 1727204711.46276: checking for max_fail_percentage 49116 1727204711.46277: done checking for max_fail_percentage 49116 1727204711.46278: checking to see if all hosts have failed and the running result is not ok 49116 1727204711.46279: done checking to see if all hosts have failed 49116 1727204711.46279: getting the remaining hosts for this loop 49116 1727204711.46280: done getting the remaining hosts for this loop 49116 1727204711.46282: getting the next task for host managed-node3 49116 1727204711.46286: done getting next task for host managed-node3 49116 1727204711.46288: ^ task is: TASK: Ensure state in ["present", "absent"] 49116 1727204711.46290: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204711.46292: getting variables 49116 1727204711.46293: in VariableManager get_vars() 49116 1727204711.46309: Calling all_inventory to load vars for managed-node3 49116 1727204711.46311: Calling groups_inventory to load vars for managed-node3 49116 1727204711.46312: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.46319: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.46320: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.46322: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.47197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204711.48539: done with get_vars() 49116 1727204711.48560: done getting variables 49116 1727204711.48603: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.090) 0:00:34.511 ***** 49116 1727204711.48629: entering _queue_task() for managed-node3/fail 49116 1727204711.48933: worker is 1 (out of 1 available) 49116 1727204711.48947: exiting _queue_task() for managed-node3/fail 49116 1727204711.48960: done queuing things up, now waiting for results queue to drain 49116 1727204711.48961: waiting for pending results... 49116 1727204711.49164: running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] 49116 1727204711.49249: in run() - task 127b8e07-fff9-02f7-957b-00000000093c 49116 1727204711.49262: variable 'ansible_search_path' from source: unknown 49116 1727204711.49267: variable 'ansible_search_path' from source: unknown 49116 1727204711.49304: calling self._execute() 49116 1727204711.49394: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204711.49398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204711.49413: variable 'omit' from source: magic vars 49116 1727204711.49729: variable 'ansible_distribution_major_version' from source: facts 49116 1727204711.49746: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204711.49852: variable 'state' from source: include params 49116 1727204711.49858: Evaluated conditional (state not in ["present", "absent"]): False 49116 1727204711.49861: when evaluation is False, skipping this task 49116 1727204711.49864: _execute() done 49116 1727204711.49873: dumping result to json 49116 1727204711.49876: done dumping result, returning 49116 1727204711.49879: done running TaskExecutor() for managed-node3/TASK: Ensure state in ["present", "absent"] [127b8e07-fff9-02f7-957b-00000000093c] 49116 1727204711.49881: sending task result for task 127b8e07-fff9-02f7-957b-00000000093c 49116 1727204711.49981: done sending task result for task 127b8e07-fff9-02f7-957b-00000000093c 49116 1727204711.49984: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 49116 1727204711.50038: no more pending 
results, returning what we have 49116 1727204711.50043: results queue empty 49116 1727204711.50044: checking for any_errors_fatal 49116 1727204711.50046: done checking for any_errors_fatal 49116 1727204711.50047: checking for max_fail_percentage 49116 1727204711.50049: done checking for max_fail_percentage 49116 1727204711.50050: checking to see if all hosts have failed and the running result is not ok 49116 1727204711.50051: done checking to see if all hosts have failed 49116 1727204711.50051: getting the remaining hosts for this loop 49116 1727204711.50053: done getting the remaining hosts for this loop 49116 1727204711.50057: getting the next task for host managed-node3 49116 1727204711.50076: done getting next task for host managed-node3 49116 1727204711.50080: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 49116 1727204711.50085: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204711.50091: getting variables 49116 1727204711.50092: in VariableManager get_vars() 49116 1727204711.50138: Calling all_inventory to load vars for managed-node3 49116 1727204711.50141: Calling groups_inventory to load vars for managed-node3 49116 1727204711.50144: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.50157: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.50159: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.50162: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.51209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204711.52422: done with get_vars() 49116 1727204711.52451: done getting variables 49116 1727204711.52507: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.039) 0:00:34.550 ***** 49116 1727204711.52534: entering _queue_task() for managed-node3/fail 49116 1727204711.52854: worker is 1 (out of 1 available) 49116 1727204711.52871: exiting _queue_task() for managed-node3/fail 49116 1727204711.52887: done queuing things up, now waiting for results queue to drain 49116 1727204711.52889: waiting for pending results... 
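The two guard tasks from manage_test_interface.yml traced here (the state check just skipped, the type check just queued) both resolve to the fail action plugin and are gated by the conditions reported as false_condition in the log. A hedged reconstruction, based only on the task names, the fail action, and those condition strings, is sketched below; the msg wording is assumed, since the file's actual text is not visible in the trace.

    - name: Ensure state in ["present", "absent"]
      ansible.builtin.fail:
        msg: state must be present or absent      # message text is assumed
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      ansible.builtin.fail:
        msg: type must be dummy, tap or veth      # message text is assumed
      when: type not in ["dummy", "tap", "veth"]

Because the supplied state and type values satisfy both membership checks, each condition evaluates to False and the tasks are skipped rather than failing the play, which is what the skipping: output above and below records.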
49116 1727204711.53093: running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] 49116 1727204711.53177: in run() - task 127b8e07-fff9-02f7-957b-00000000093d 49116 1727204711.53190: variable 'ansible_search_path' from source: unknown 49116 1727204711.53194: variable 'ansible_search_path' from source: unknown 49116 1727204711.53228: calling self._execute() 49116 1727204711.53317: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204711.53321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204711.53336: variable 'omit' from source: magic vars 49116 1727204711.53660: variable 'ansible_distribution_major_version' from source: facts 49116 1727204711.53674: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204711.53787: variable 'type' from source: play vars 49116 1727204711.53791: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 49116 1727204711.53794: when evaluation is False, skipping this task 49116 1727204711.53797: _execute() done 49116 1727204711.53800: dumping result to json 49116 1727204711.53803: done dumping result, returning 49116 1727204711.53809: done running TaskExecutor() for managed-node3/TASK: Ensure type in ["dummy", "tap", "veth"] [127b8e07-fff9-02f7-957b-00000000093d] 49116 1727204711.53815: sending task result for task 127b8e07-fff9-02f7-957b-00000000093d 49116 1727204711.53917: done sending task result for task 127b8e07-fff9-02f7-957b-00000000093d 49116 1727204711.53920: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 49116 1727204711.53983: no more pending results, returning what we have 49116 1727204711.53987: results queue empty 49116 1727204711.53988: checking for any_errors_fatal 49116 1727204711.53994: done checking for any_errors_fatal 49116 1727204711.53994: checking for max_fail_percentage 49116 1727204711.53997: done checking for max_fail_percentage 49116 1727204711.53998: checking to see if all hosts have failed and the running result is not ok 49116 1727204711.53998: done checking to see if all hosts have failed 49116 1727204711.53999: getting the remaining hosts for this loop 49116 1727204711.54001: done getting the remaining hosts for this loop 49116 1727204711.54005: getting the next task for host managed-node3 49116 1727204711.54013: done getting next task for host managed-node3 49116 1727204711.54018: ^ task is: TASK: Include the task 'show_interfaces.yml' 49116 1727204711.54022: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204711.54026: getting variables 49116 1727204711.54028: in VariableManager get_vars() 49116 1727204711.54078: Calling all_inventory to load vars for managed-node3 49116 1727204711.54080: Calling groups_inventory to load vars for managed-node3 49116 1727204711.54082: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.54096: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.54098: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.54101: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.55961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204711.57163: done with get_vars() 49116 1727204711.57193: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.047) 0:00:34.597 ***** 49116 1727204711.57281: entering _queue_task() for managed-node3/include_tasks 49116 1727204711.57610: worker is 1 (out of 1 available) 49116 1727204711.57625: exiting _queue_task() for managed-node3/include_tasks 49116 1727204711.57640: done queuing things up, now waiting for results queue to drain 49116 1727204711.57642: waiting for pending results... 49116 1727204711.58090: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 49116 1727204711.58096: in run() - task 127b8e07-fff9-02f7-957b-00000000093e 49116 1727204711.58122: variable 'ansible_search_path' from source: unknown 49116 1727204711.58130: variable 'ansible_search_path' from source: unknown 49116 1727204711.58184: calling self._execute() 49116 1727204711.58301: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204711.58314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204711.58401: variable 'omit' from source: magic vars 49116 1727204711.58764: variable 'ansible_distribution_major_version' from source: facts 49116 1727204711.58786: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204711.58798: _execute() done 49116 1727204711.58806: dumping result to json 49116 1727204711.58813: done dumping result, returning 49116 1727204711.58824: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-02f7-957b-00000000093e] 49116 1727204711.58838: sending task result for task 127b8e07-fff9-02f7-957b-00000000093e 49116 1727204711.59189: done sending task result for task 127b8e07-fff9-02f7-957b-00000000093e 49116 1727204711.59193: WORKER PROCESS EXITING 49116 1727204711.59221: no more pending results, returning what we have 49116 1727204711.59226: in VariableManager get_vars() 49116 1727204711.59278: Calling all_inventory to load vars for managed-node3 49116 1727204711.59281: Calling groups_inventory to load vars for managed-node3 49116 1727204711.59284: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.59298: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.59301: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.59305: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.61134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 49116 1727204711.63459: done with get_vars() 49116 1727204711.63487: variable 'ansible_search_path' from source: unknown 49116 1727204711.63489: variable 'ansible_search_path' from source: unknown 49116 1727204711.63534: we have included files to process 49116 1727204711.63535: generating all_blocks data 49116 1727204711.63537: done generating all_blocks data 49116 1727204711.63541: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204711.63542: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204711.63545: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49116 1727204711.63658: in VariableManager get_vars() 49116 1727204711.63687: done with get_vars() 49116 1727204711.63811: done processing included file 49116 1727204711.63813: iterating over new_blocks loaded from include file 49116 1727204711.63815: in VariableManager get_vars() 49116 1727204711.63835: done with get_vars() 49116 1727204711.63837: filtering new block on tags 49116 1727204711.63857: done filtering new block on tags 49116 1727204711.63860: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 49116 1727204711.63867: extending task lists for all hosts with included blocks 49116 1727204711.64312: done extending task lists 49116 1727204711.64314: done processing included files 49116 1727204711.64315: results queue empty 49116 1727204711.64315: checking for any_errors_fatal 49116 1727204711.64319: done checking for any_errors_fatal 49116 1727204711.64320: checking for max_fail_percentage 49116 1727204711.64321: done checking for max_fail_percentage 49116 1727204711.64322: checking to see if all hosts have failed and the running result is not ok 49116 1727204711.64323: done checking to see if all hosts have failed 49116 1727204711.64324: getting the remaining hosts for this loop 49116 1727204711.64325: done getting the remaining hosts for this loop 49116 1727204711.64328: getting the next task for host managed-node3 49116 1727204711.64332: done getting next task for host managed-node3 49116 1727204711.64334: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 49116 1727204711.64338: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204711.64341: getting variables 49116 1727204711.64342: in VariableManager get_vars() 49116 1727204711.64358: Calling all_inventory to load vars for managed-node3 49116 1727204711.64360: Calling groups_inventory to load vars for managed-node3 49116 1727204711.64362: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.64370: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.64373: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.64376: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.65946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204711.68150: done with get_vars() 49116 1727204711.68192: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.110) 0:00:34.707 ***** 49116 1727204711.68286: entering _queue_task() for managed-node3/include_tasks 49116 1727204711.68700: worker is 1 (out of 1 available) 49116 1727204711.68713: exiting _queue_task() for managed-node3/include_tasks 49116 1727204711.68725: done queuing things up, now waiting for results queue to drain 49116 1727204711.68726: waiting for pending results... 49116 1727204711.69057: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 49116 1727204711.69225: in run() - task 127b8e07-fff9-02f7-957b-000000000aa0 49116 1727204711.69250: variable 'ansible_search_path' from source: unknown 49116 1727204711.69258: variable 'ansible_search_path' from source: unknown 49116 1727204711.69316: calling self._execute() 49116 1727204711.69442: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204711.69455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204711.69475: variable 'omit' from source: magic vars 49116 1727204711.69995: variable 'ansible_distribution_major_version' from source: facts 49116 1727204711.70024: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204711.70036: _execute() done 49116 1727204711.70045: dumping result to json 49116 1727204711.70054: done dumping result, returning 49116 1727204711.70067: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-02f7-957b-000000000aa0] 49116 1727204711.70081: sending task result for task 127b8e07-fff9-02f7-957b-000000000aa0 49116 1727204711.70401: no more pending results, returning what we have 49116 1727204711.70408: in VariableManager get_vars() 49116 1727204711.70469: Calling all_inventory to load vars for managed-node3 49116 1727204711.70473: Calling groups_inventory to load vars for managed-node3 49116 1727204711.70475: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.70495: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.70499: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.70503: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.71210: done sending task result for task 127b8e07-fff9-02f7-957b-000000000aa0 49116 1727204711.71215: WORKER PROCESS EXITING 49116 1727204711.72907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 49116 1727204711.75272: done with get_vars() 49116 1727204711.75309: variable 'ansible_search_path' from source: unknown 49116 1727204711.75310: variable 'ansible_search_path' from source: unknown 49116 1727204711.75391: we have included files to process 49116 1727204711.75393: generating all_blocks data 49116 1727204711.75394: done generating all_blocks data 49116 1727204711.75395: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204711.75396: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204711.75399: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49116 1727204711.75726: done processing included file 49116 1727204711.75728: iterating over new_blocks loaded from include file 49116 1727204711.75730: in VariableManager get_vars() 49116 1727204711.75763: done with get_vars() 49116 1727204711.75767: filtering new block on tags 49116 1727204711.75790: done filtering new block on tags 49116 1727204711.75793: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node3 49116 1727204711.75799: extending task lists for all hosts with included blocks 49116 1727204711.75988: done extending task lists 49116 1727204711.75990: done processing included files 49116 1727204711.75991: results queue empty 49116 1727204711.75991: checking for any_errors_fatal 49116 1727204711.75995: done checking for any_errors_fatal 49116 1727204711.75996: checking for max_fail_percentage 49116 1727204711.75997: done checking for max_fail_percentage 49116 1727204711.75998: checking to see if all hosts have failed and the running result is not ok 49116 1727204711.75999: done checking to see if all hosts have failed 49116 1727204711.76000: getting the remaining hosts for this loop 49116 1727204711.76001: done getting the remaining hosts for this loop 49116 1727204711.76003: getting the next task for host managed-node3 49116 1727204711.76008: done getting next task for host managed-node3 49116 1727204711.76010: ^ task is: TASK: Gather current interface info 49116 1727204711.76014: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 49116 1727204711.76017: getting variables 49116 1727204711.76019: in VariableManager get_vars() 49116 1727204711.76037: Calling all_inventory to load vars for managed-node3 49116 1727204711.76040: Calling groups_inventory to load vars for managed-node3 49116 1727204711.76042: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204711.76048: Calling all_plugins_play to load vars for managed-node3 49116 1727204711.76051: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204711.76054: Calling groups_plugins_play to load vars for managed-node3 49116 1727204711.77877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204711.80189: done with get_vars() 49116 1727204711.80221: done getting variables 49116 1727204711.80278: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:05:11 -0400 (0:00:00.120) 0:00:34.828 ***** 49116 1727204711.80315: entering _queue_task() for managed-node3/command 49116 1727204711.80960: worker is 1 (out of 1 available) 49116 1727204711.80978: exiting _queue_task() for managed-node3/command 49116 1727204711.80994: done queuing things up, now waiting for results queue to drain 49116 1727204711.80995: waiting for pending results... 
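The two include operations traced above pull show_interfaces.yml and then get_current_interfaces.yml into the play for managed-node3 via include_tasks, and the interface-gathering command task is queued immediately below. A minimal sketch of that include chain, reconstructed only from the task names, the include_tasks action, and the task paths recorded in this log (the relative include paths and any other task options are assumptions, not taken from the real test files):

# Sketch inferred from the log above; actual file contents may differ.
# manage_test_interface.yml, around line 13:
- name: Include the task 'show_interfaces.yml'
  ansible.builtin.include_tasks: show_interfaces.yml

# show_interfaces.yml, around line 3:
- name: Include the task 'get_current_interfaces.yml'
  ansible.builtin.include_tasks: get_current_interfaces.yml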
49116 1727204711.81539: running TaskExecutor() for managed-node3/TASK: Gather current interface info 49116 1727204711.81738: in run() - task 127b8e07-fff9-02f7-957b-000000000ad7 49116 1727204711.81777: variable 'ansible_search_path' from source: unknown 49116 1727204711.81821: variable 'ansible_search_path' from source: unknown 49116 1727204711.81836: calling self._execute() 49116 1727204711.81950: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204711.81962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204711.81983: variable 'omit' from source: magic vars 49116 1727204711.82410: variable 'ansible_distribution_major_version' from source: facts 49116 1727204711.82473: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204711.82476: variable 'omit' from source: magic vars 49116 1727204711.82513: variable 'omit' from source: magic vars 49116 1727204711.82556: variable 'omit' from source: magic vars 49116 1727204711.82616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204711.82663: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204711.82696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204711.82800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204711.82804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204711.82806: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204711.82809: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204711.82811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204711.82900: Set connection var ansible_connection to ssh 49116 1727204711.82924: Set connection var ansible_timeout to 10 49116 1727204711.82937: Set connection var ansible_shell_executable to /bin/sh 49116 1727204711.82946: Set connection var ansible_pipelining to False 49116 1727204711.82953: Set connection var ansible_shell_type to sh 49116 1727204711.82962: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204711.82993: variable 'ansible_shell_executable' from source: unknown 49116 1727204711.83001: variable 'ansible_connection' from source: unknown 49116 1727204711.83008: variable 'ansible_module_compression' from source: unknown 49116 1727204711.83071: variable 'ansible_shell_type' from source: unknown 49116 1727204711.83074: variable 'ansible_shell_executable' from source: unknown 49116 1727204711.83077: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204711.83079: variable 'ansible_pipelining' from source: unknown 49116 1727204711.83080: variable 'ansible_timeout' from source: unknown 49116 1727204711.83083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204711.83235: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204711.83240: variable 'omit' from source: magic vars 49116 
1727204711.83246: starting attempt loop 49116 1727204711.83272: running the handler 49116 1727204711.83281: _low_level_execute_command(): starting 49116 1727204711.83294: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204711.84976: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204711.85236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204711.85320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204711.87502: stdout chunk (state=3): >>>/root <<< 49116 1727204711.87736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204711.87740: stdout chunk (state=3): >>><<< 49116 1727204711.87742: stderr chunk (state=3): >>><<< 49116 1727204711.87770: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204711.87798: _low_level_execute_command(): starting 49116 1727204711.88074: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048 `" && echo ansible-tmp-1727204711.8777738-50990-74971788663048="` echo /root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048 `" ) && sleep 0' 49116 1727204711.89459: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204711.89464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204711.89470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204711.89481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204711.89646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204711.89972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204711.89975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204711.92061: stdout chunk (state=3): >>>ansible-tmp-1727204711.8777738-50990-74971788663048=/root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048 <<< 49116 1727204711.92276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204711.92280: stdout chunk (state=3): >>><<< 49116 1727204711.92473: stderr chunk (state=3): >>><<< 49116 1727204711.92478: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204711.8777738-50990-74971788663048=/root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204711.92480: variable 'ansible_module_compression' from source: unknown 49116 1727204711.92520: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204711.92568: variable 'ansible_facts' from source: unknown 49116 
1727204711.92857: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/AnsiballZ_command.py 49116 1727204711.93194: Sending initial data 49116 1727204711.93198: Sent initial data (155 bytes) 49116 1727204711.95084: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204711.95090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204711.95093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204711.95095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204711.95098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204711.97062: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204711.97194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204711.97282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp9114m04u /root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/AnsiballZ_command.py <<< 49116 1727204711.97286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/AnsiballZ_command.py" <<< 49116 1727204711.97393: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmp9114m04u" to remote "/root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/AnsiballZ_command.py" <<< 49116 1727204711.99013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204711.99018: stdout chunk (state=3): >>><<< 49116 1727204711.99021: stderr chunk (state=3): >>><<< 49116 1727204711.99052: done transferring module to remote 49116 1727204711.99063: _low_level_execute_command(): starting 49116 1727204711.99070: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/ /root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/AnsiballZ_command.py && sleep 0' 49116 1727204712.00689: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204712.00694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204712.00716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204712.00898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204712.00994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204712.03027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204712.03119: stderr chunk (state=3): >>><<< 49116 1727204712.03123: stdout chunk (state=3): >>><<< 49116 1727204712.03144: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204712.03148: _low_level_execute_command(): starting 49116 1727204712.03154: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/AnsiballZ_command.py && sleep 0' 49116 1727204712.03791: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204712.03798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204712.03810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204712.03824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204712.03838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204712.03845: stderr chunk (state=3): >>>debug2: match not found <<< 49116 1727204712.03855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204712.03870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49116 1727204712.03878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address <<< 49116 1727204712.03884: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49116 1727204712.03892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204712.03903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204712.03916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204712.03922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204712.03930: stderr chunk (state=3): >>>debug2: match found <<< 49116 1727204712.03942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204712.04013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204712.04027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204712.04048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204712.04156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204712.22169: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:05:12.216638", "end": "2024-09-24 15:05:12.220477", "delta": "0:00:00.003839", "msg": "", "invocation": {"module_args": {"chdir": 
"/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204712.24146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204712.24155: stdout chunk (state=3): >>><<< 49116 1727204712.24158: stderr chunk (state=3): >>><<< 49116 1727204712.24162: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:05:12.216638", "end": "2024-09-24 15:05:12.220477", "delta": "0:00:00.003839", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
49116 1727204712.24174: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204712.24178: _low_level_execute_command(): starting 49116 1727204712.24180: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204711.8777738-50990-74971788663048/ > /dev/null 2>&1 && sleep 0' 49116 1727204712.24878: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204712.24945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204712.24986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204712.25008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204712.25117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204712.27180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204712.27253: stderr chunk (state=3): >>><<< 49116 1727204712.27257: stdout chunk (state=3): >>><<< 49116 1727204712.27275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 
10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204712.27312: handler run complete 49116 1727204712.27437: Evaluated conditional (False): False 49116 1727204712.27440: attempt loop complete, returning result 49116 1727204712.27443: _execute() done 49116 1727204712.27445: dumping result to json 49116 1727204712.27447: done dumping result, returning 49116 1727204712.27449: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [127b8e07-fff9-02f7-957b-000000000ad7] 49116 1727204712.27451: sending task result for task 127b8e07-fff9-02f7-957b-000000000ad7 49116 1727204712.27523: done sending task result for task 127b8e07-fff9-02f7-957b-000000000ad7 49116 1727204712.27527: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003839", "end": "2024-09-24 15:05:12.220477", "rc": 0, "start": "2024-09-24 15:05:12.216638" } STDOUT: bonding_masters eth0 lo lsr101 peerlsr101 49116 1727204712.27609: no more pending results, returning what we have 49116 1727204712.27613: results queue empty 49116 1727204712.27614: checking for any_errors_fatal 49116 1727204712.27617: done checking for any_errors_fatal 49116 1727204712.27618: checking for max_fail_percentage 49116 1727204712.27620: done checking for max_fail_percentage 49116 1727204712.27621: checking to see if all hosts have failed and the running result is not ok 49116 1727204712.27622: done checking to see if all hosts have failed 49116 1727204712.27622: getting the remaining hosts for this loop 49116 1727204712.27624: done getting the remaining hosts for this loop 49116 1727204712.27629: getting the next task for host managed-node3 49116 1727204712.27638: done getting next task for host managed-node3 49116 1727204712.27641: ^ task is: TASK: Set current_interfaces 49116 1727204712.27646: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204712.27652: getting variables 49116 1727204712.27653: in VariableManager get_vars() 49116 1727204712.27805: Calling all_inventory to load vars for managed-node3 49116 1727204712.27809: Calling groups_inventory to load vars for managed-node3 49116 1727204712.27811: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204712.27823: Calling all_plugins_play to load vars for managed-node3 49116 1727204712.27826: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204712.27828: Calling groups_plugins_play to load vars for managed-node3 49116 1727204712.29251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204712.30459: done with get_vars() 49116 1727204712.30491: done getting variables 49116 1727204712.30543: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.502) 0:00:35.330 ***** 49116 1727204712.30577: entering _queue_task() for managed-node3/set_fact 49116 1727204712.30875: worker is 1 (out of 1 available) 49116 1727204712.30891: exiting _queue_task() for managed-node3/set_fact 49116 1727204712.30909: done queuing things up, now waiting for results queue to drain 49116 1727204712.30910: waiting for pending results... 
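The module_args echoed in the command result above (chdir: /sys/class/net, _raw_params: ls -1) and the Set current_interfaces task queued here identify the pair of tasks being run from get_current_interfaces.yml. A hedged reconstruction of both, using only what this log records (the register name comes from the _current_interfaces variable referenced below; changed_when and the stdout_lines split are assumptions consistent with the reported results):

# Hypothetical sketch of get_current_interfaces.yml, lines 3 and 9.
- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # assumed to be a register; the name appears as a variable below
  changed_when: false             # assumed, since the reported task result is "changed": false

- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # stdout_lines split is assumed

With the stdout shown above, this yields the five-element list reported in the Set current_interfaces result: bonding_masters, eth0, lo, lsr101, peerlsr101.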
49116 1727204712.31116: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 49116 1727204712.31209: in run() - task 127b8e07-fff9-02f7-957b-000000000ad8 49116 1727204712.31224: variable 'ansible_search_path' from source: unknown 49116 1727204712.31228: variable 'ansible_search_path' from source: unknown 49116 1727204712.31266: calling self._execute() 49116 1727204712.31355: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.31370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.31383: variable 'omit' from source: magic vars 49116 1727204712.31705: variable 'ansible_distribution_major_version' from source: facts 49116 1727204712.31716: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204712.31722: variable 'omit' from source: magic vars 49116 1727204712.31764: variable 'omit' from source: magic vars 49116 1727204712.31853: variable '_current_interfaces' from source: set_fact 49116 1727204712.31911: variable 'omit' from source: magic vars 49116 1727204712.31948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204712.31981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204712.31997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204712.32017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204712.32027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204712.32054: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204712.32057: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.32060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.32138: Set connection var ansible_connection to ssh 49116 1727204712.32148: Set connection var ansible_timeout to 10 49116 1727204712.32155: Set connection var ansible_shell_executable to /bin/sh 49116 1727204712.32161: Set connection var ansible_pipelining to False 49116 1727204712.32163: Set connection var ansible_shell_type to sh 49116 1727204712.32171: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204712.32190: variable 'ansible_shell_executable' from source: unknown 49116 1727204712.32193: variable 'ansible_connection' from source: unknown 49116 1727204712.32196: variable 'ansible_module_compression' from source: unknown 49116 1727204712.32198: variable 'ansible_shell_type' from source: unknown 49116 1727204712.32201: variable 'ansible_shell_executable' from source: unknown 49116 1727204712.32203: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.32207: variable 'ansible_pipelining' from source: unknown 49116 1727204712.32210: variable 'ansible_timeout' from source: unknown 49116 1727204712.32214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.32329: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 49116 1727204712.32341: variable 'omit' from source: magic vars 49116 1727204712.32349: starting attempt loop 49116 1727204712.32353: running the handler 49116 1727204712.32362: handler run complete 49116 1727204712.32373: attempt loop complete, returning result 49116 1727204712.32376: _execute() done 49116 1727204712.32379: dumping result to json 49116 1727204712.32381: done dumping result, returning 49116 1727204712.32389: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [127b8e07-fff9-02f7-957b-000000000ad8] 49116 1727204712.32394: sending task result for task 127b8e07-fff9-02f7-957b-000000000ad8 49116 1727204712.32487: done sending task result for task 127b8e07-fff9-02f7-957b-000000000ad8 49116 1727204712.32490: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "lsr101", "peerlsr101" ] }, "changed": false } 49116 1727204712.32553: no more pending results, returning what we have 49116 1727204712.32557: results queue empty 49116 1727204712.32558: checking for any_errors_fatal 49116 1727204712.32567: done checking for any_errors_fatal 49116 1727204712.32568: checking for max_fail_percentage 49116 1727204712.32570: done checking for max_fail_percentage 49116 1727204712.32571: checking to see if all hosts have failed and the running result is not ok 49116 1727204712.32572: done checking to see if all hosts have failed 49116 1727204712.32573: getting the remaining hosts for this loop 49116 1727204712.32574: done getting the remaining hosts for this loop 49116 1727204712.32579: getting the next task for host managed-node3 49116 1727204712.32588: done getting next task for host managed-node3 49116 1727204712.32591: ^ task is: TASK: Show current_interfaces 49116 1727204712.32595: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204712.32599: getting variables 49116 1727204712.32601: in VariableManager get_vars() 49116 1727204712.32646: Calling all_inventory to load vars for managed-node3 49116 1727204712.32649: Calling groups_inventory to load vars for managed-node3 49116 1727204712.32651: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204712.32662: Calling all_plugins_play to load vars for managed-node3 49116 1727204712.32672: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204712.32676: Calling groups_plugins_play to load vars for managed-node3 49116 1727204712.33829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204712.35032: done with get_vars() 49116 1727204712.35061: done getting variables 49116 1727204712.35113: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.045) 0:00:35.376 ***** 49116 1727204712.35142: entering _queue_task() for managed-node3/debug 49116 1727204712.35439: worker is 1 (out of 1 available) 49116 1727204712.35456: exiting _queue_task() for managed-node3/debug 49116 1727204712.35471: done queuing things up, now waiting for results queue to drain 49116 1727204712.35472: waiting for pending results... 49116 1727204712.35669: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 49116 1727204712.35756: in run() - task 127b8e07-fff9-02f7-957b-000000000aa1 49116 1727204712.35770: variable 'ansible_search_path' from source: unknown 49116 1727204712.35774: variable 'ansible_search_path' from source: unknown 49116 1727204712.35810: calling self._execute() 49116 1727204712.35895: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.35899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.35911: variable 'omit' from source: magic vars 49116 1727204712.36219: variable 'ansible_distribution_major_version' from source: facts 49116 1727204712.36229: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204712.36238: variable 'omit' from source: magic vars 49116 1727204712.36276: variable 'omit' from source: magic vars 49116 1727204712.36354: variable 'current_interfaces' from source: set_fact 49116 1727204712.36380: variable 'omit' from source: magic vars 49116 1727204712.36419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204712.36452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204712.36470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204712.36488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204712.36500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204712.36524: 
variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204712.36527: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.36530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.36610: Set connection var ansible_connection to ssh 49116 1727204712.36621: Set connection var ansible_timeout to 10 49116 1727204712.36628: Set connection var ansible_shell_executable to /bin/sh 49116 1727204712.36636: Set connection var ansible_pipelining to False 49116 1727204712.36639: Set connection var ansible_shell_type to sh 49116 1727204712.36641: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204712.36660: variable 'ansible_shell_executable' from source: unknown 49116 1727204712.36664: variable 'ansible_connection' from source: unknown 49116 1727204712.36668: variable 'ansible_module_compression' from source: unknown 49116 1727204712.36671: variable 'ansible_shell_type' from source: unknown 49116 1727204712.36673: variable 'ansible_shell_executable' from source: unknown 49116 1727204712.36675: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.36680: variable 'ansible_pipelining' from source: unknown 49116 1727204712.36682: variable 'ansible_timeout' from source: unknown 49116 1727204712.36689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.36801: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204712.36813: variable 'omit' from source: magic vars 49116 1727204712.36817: starting attempt loop 49116 1727204712.36820: running the handler 49116 1727204712.36862: handler run complete 49116 1727204712.36876: attempt loop complete, returning result 49116 1727204712.36879: _execute() done 49116 1727204712.36882: dumping result to json 49116 1727204712.36884: done dumping result, returning 49116 1727204712.36891: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [127b8e07-fff9-02f7-957b-000000000aa1] 49116 1727204712.36897: sending task result for task 127b8e07-fff9-02f7-957b-000000000aa1 49116 1727204712.36993: done sending task result for task 127b8e07-fff9-02f7-957b-000000000aa1 49116 1727204712.36996: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101'] 49116 1727204712.37062: no more pending results, returning what we have 49116 1727204712.37070: results queue empty 49116 1727204712.37071: checking for any_errors_fatal 49116 1727204712.37079: done checking for any_errors_fatal 49116 1727204712.37080: checking for max_fail_percentage 49116 1727204712.37081: done checking for max_fail_percentage 49116 1727204712.37083: checking to see if all hosts have failed and the running result is not ok 49116 1727204712.37083: done checking to see if all hosts have failed 49116 1727204712.37084: getting the remaining hosts for this loop 49116 1727204712.37085: done getting the remaining hosts for this loop 49116 1727204712.37090: getting the next task for host managed-node3 49116 1727204712.37099: done getting next task for host managed-node3 49116 1727204712.37102: ^ task is: TASK: Install iproute 49116 1727204712.37105: ^ state is: HOST STATE: block=2, 
task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204712.37109: getting variables 49116 1727204712.37110: in VariableManager get_vars() 49116 1727204712.37155: Calling all_inventory to load vars for managed-node3 49116 1727204712.37158: Calling groups_inventory to load vars for managed-node3 49116 1727204712.37160: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204712.37178: Calling all_plugins_play to load vars for managed-node3 49116 1727204712.37181: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204712.37184: Calling groups_plugins_play to load vars for managed-node3 49116 1727204712.42614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204712.43817: done with get_vars() 49116 1727204712.43849: done getting variables 49116 1727204712.43893: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:05:12 -0400 (0:00:00.087) 0:00:35.464 ***** 49116 1727204712.43917: entering _queue_task() for managed-node3/package 49116 1727204712.44221: worker is 1 (out of 1 available) 49116 1727204712.44241: exiting _queue_task() for managed-node3/package 49116 1727204712.44255: done queuing things up, now waiting for results queue to drain 49116 1727204712.44258: waiting for pending results... 
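Two more tasks bracket this point in the trace: the debug task that produced the current_interfaces message above (show_interfaces.yml:5) and the 'Install iproute' package task from manage_test_interface.yml:16 that starts executing below. Plausible sketches of both follow; only the task names, the action plugins loaded, the printed message, and the __network_is_ostree fact lookup come from the log, while the remaining options are assumptions:

# Sketches inferred from the log; option details are not taken from the real test files.
- name: Show current_interfaces            # show_interfaces.yml:5; matches the MSG printed above
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"

- name: Install iproute                    # manage_test_interface.yml:16; the log shows the package
  ansible.builtin.package:                 # action being loaded and __network_is_ostree consulted,
    name: iproute                          # but not the full set of task options
    state: present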
49116 1727204712.44457: running TaskExecutor() for managed-node3/TASK: Install iproute 49116 1727204712.44552: in run() - task 127b8e07-fff9-02f7-957b-00000000093f 49116 1727204712.44564: variable 'ansible_search_path' from source: unknown 49116 1727204712.44570: variable 'ansible_search_path' from source: unknown 49116 1727204712.44609: calling self._execute() 49116 1727204712.44694: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.44700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.44716: variable 'omit' from source: magic vars 49116 1727204712.45041: variable 'ansible_distribution_major_version' from source: facts 49116 1727204712.45052: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204712.45056: variable 'omit' from source: magic vars 49116 1727204712.45090: variable 'omit' from source: magic vars 49116 1727204712.45302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49116 1727204712.47174: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49116 1727204712.47178: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49116 1727204712.47206: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49116 1727204712.47250: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49116 1727204712.47286: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49116 1727204712.47397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49116 1727204712.47436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49116 1727204712.47471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49116 1727204712.47522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49116 1727204712.47542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49116 1727204712.47677: variable '__network_is_ostree' from source: set_fact 49116 1727204712.47688: variable 'omit' from source: magic vars 49116 1727204712.47728: variable 'omit' from source: magic vars 49116 1727204712.47763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204712.47836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204712.47840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204712.47843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 49116 1727204712.47846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204712.47871: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204712.47875: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.47877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.47953: Set connection var ansible_connection to ssh 49116 1727204712.47964: Set connection var ansible_timeout to 10 49116 1727204712.47973: Set connection var ansible_shell_executable to /bin/sh 49116 1727204712.47978: Set connection var ansible_pipelining to False 49116 1727204712.47981: Set connection var ansible_shell_type to sh 49116 1727204712.47990: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204712.48012: variable 'ansible_shell_executable' from source: unknown 49116 1727204712.48016: variable 'ansible_connection' from source: unknown 49116 1727204712.48019: variable 'ansible_module_compression' from source: unknown 49116 1727204712.48021: variable 'ansible_shell_type' from source: unknown 49116 1727204712.48024: variable 'ansible_shell_executable' from source: unknown 49116 1727204712.48026: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204712.48038: variable 'ansible_pipelining' from source: unknown 49116 1727204712.48046: variable 'ansible_timeout' from source: unknown 49116 1727204712.48048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204712.48121: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204712.48132: variable 'omit' from source: magic vars 49116 1727204712.48137: starting attempt loop 49116 1727204712.48140: running the handler 49116 1727204712.48149: variable 'ansible_facts' from source: unknown 49116 1727204712.48152: variable 'ansible_facts' from source: unknown 49116 1727204712.48184: _low_level_execute_command(): starting 49116 1727204712.48190: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204712.48736: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204712.48741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204712.48744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204712.48802: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204712.48808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204712.48817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204712.48890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204712.50770: stdout chunk (state=3): >>>/root <<< 49116 1727204712.51079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204712.51083: stdout chunk (state=3): >>><<< 49116 1727204712.51086: stderr chunk (state=3): >>><<< 49116 1727204712.51090: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204712.51101: _low_level_execute_command(): starting 49116 1727204712.51105: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259 `" && echo ansible-tmp-1727204712.5099087-51023-258488258201259="` echo /root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259 `" ) && sleep 0' 49116 1727204712.51753: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204712.51851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204712.51892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204712.51909: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204712.51931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204712.52043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204712.54253: stdout chunk (state=3): >>>ansible-tmp-1727204712.5099087-51023-258488258201259=/root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259 <<< 49116 1727204712.54475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204712.54494: stdout chunk (state=3): >>><<< 49116 1727204712.54507: stderr chunk (state=3): >>><<< 49116 1727204712.54533: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204712.5099087-51023-258488258201259=/root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204712.54594: variable 'ansible_module_compression' from source: unknown 49116 1727204712.54871: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 49116 1727204712.54874: variable 'ansible_facts' from source: unknown 49116 1727204712.54876: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/AnsiballZ_dnf.py 49116 1727204712.55306: Sending initial data 49116 1727204712.55406: Sent initial data (152 bytes) 49116 1727204712.56025: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204712.56048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204712.56079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found <<< 49116 1727204712.56092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204712.56190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204712.56216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204712.56324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204712.58159: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204712.58267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204712.58347: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpvsmakh9y /root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/AnsiballZ_dnf.py <<< 49116 1727204712.58350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/AnsiballZ_dnf.py" <<< 49116 1727204712.58401: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpvsmakh9y" to remote "/root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/AnsiballZ_dnf.py" <<< 49116 1727204712.59844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204712.59849: stdout chunk (state=3): >>><<< 49116 1727204712.59852: stderr chunk (state=3): >>><<< 49116 1727204712.59854: done transferring module to remote 49116 1727204712.59857: _low_level_execute_command(): starting 49116 1727204712.59860: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/ /root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/AnsiballZ_dnf.py && sleep 0' 49116 1727204712.60714: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204712.60830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204712.60856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204712.60971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204712.63081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204712.63086: stderr chunk (state=3): >>><<< 49116 1727204712.63089: stdout chunk (state=3): >>><<< 49116 1727204712.63116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204712.63120: _low_level_execute_command(): starting 49116 1727204712.63123: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/AnsiballZ_dnf.py && sleep 0' 49116 1727204712.63959: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204712.63974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204712.63981: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204712.64103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204713.85748: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 49116 1727204713.91153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. <<< 49116 1727204713.91213: stderr chunk (state=3): >>><<< 49116 1727204713.91217: stdout chunk (state=3): >>><<< 49116 1727204713.91234: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
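The exchange above is the standard AnsiballZ round trip for the dnf module: the controller creates a per-task temporary directory under /root/.ansible/tmp on the managed node, uploads the self-contained AnsiballZ_dnf.py over the multiplexed SSH connection via SFTP, marks it executable, runs it with the remote /usr/bin/python3.12, and parses the JSON it prints ("Nothing to do", rc=0, since iproute is already installed). Stripped of the SSH debug chatter, the remote side of that sequence is roughly the following; the directory name is illustrative, the real one is timestamped per task:

    # Remote command sequence behind the module run above (directory name is
    # a placeholder for the timestamped per-task temp dir):
    ( umask 77 && mkdir -p ~/.ansible/tmp && mkdir ~/.ansible/tmp/ansible-tmp-EXAMPLE )
    # ...AnsiballZ_dnf.py is uploaded into that directory over SFTP...
    chmod u+x ~/.ansible/tmp/ansible-tmp-EXAMPLE ~/.ansible/tmp/ansible-tmp-EXAMPLE/AnsiballZ_dnf.py
    /usr/bin/python3.12 ~/.ansible/tmp/ansible-tmp-EXAMPLE/AnsiballZ_dnf.py   # prints the JSON result
    rm -rf ~/.ansible/tmp/ansible-tmp-EXAMPLE   # cleanup, visible right after the result below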
49116 1727204713.91281: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204713.91289: _low_level_execute_command(): starting 49116 1727204713.91293: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204712.5099087-51023-258488258201259/ > /dev/null 2>&1 && sleep 0' 49116 1727204713.91773: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204713.91790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204713.91794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204713.91852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204713.91855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204713.91932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204713.93992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204713.94057: stderr chunk (state=3): >>><<< 49116 1727204713.94061: stdout chunk (state=3): >>><<< 49116 1727204713.94078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204713.94086: handler run complete 49116 1727204713.94214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49116 1727204713.94371: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49116 1727204713.94405: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49116 1727204713.94431: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49116 1727204713.94457: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49116 1727204713.94515: variable '__install_status' from source: set_fact 49116 1727204713.94531: Evaluated conditional (__install_status is success): True 49116 1727204713.94547: attempt loop complete, returning result 49116 1727204713.94550: _execute() done 49116 1727204713.94553: dumping result to json 49116 1727204713.94560: done dumping result, returning 49116 1727204713.94568: done running TaskExecutor() for managed-node3/TASK: Install iproute [127b8e07-fff9-02f7-957b-00000000093f] 49116 1727204713.94573: sending task result for task 127b8e07-fff9-02f7-957b-00000000093f 49116 1727204713.94683: done sending task result for task 127b8e07-fff9-02f7-957b-00000000093f 49116 1727204713.94685: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 49116 1727204713.94782: no more pending results, returning what we have 49116 1727204713.94785: results queue empty 49116 1727204713.94787: checking for any_errors_fatal 49116 1727204713.94800: done checking for any_errors_fatal 49116 1727204713.94801: checking for max_fail_percentage 49116 1727204713.94803: done checking for max_fail_percentage 49116 1727204713.94804: checking to see if all hosts have failed and the running result is not ok 49116 1727204713.94805: done checking to see if all hosts have failed 49116 1727204713.94806: getting the remaining hosts for this loop 49116 1727204713.94807: done getting the remaining hosts for this loop 49116 1727204713.94812: getting the next task for host managed-node3 49116 1727204713.94819: done getting next task for host managed-node3 49116 1727204713.94822: ^ task is: TASK: Create veth interface {{ interface }} 49116 1727204713.94824: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204713.94829: getting variables 49116 1727204713.94830: in VariableManager get_vars() 49116 1727204713.94875: Calling all_inventory to load vars for managed-node3 49116 1727204713.94878: Calling groups_inventory to load vars for managed-node3 49116 1727204713.94881: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204713.94892: Calling all_plugins_play to load vars for managed-node3 49116 1727204713.94894: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204713.94897: Calling groups_plugins_play to load vars for managed-node3 49116 1727204713.95945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204713.97187: done with get_vars() 49116 1727204713.97215: done getting variables 49116 1727204713.97271: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204713.97373: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:05:13 -0400 (0:00:01.534) 0:00:36.998 ***** 49116 1727204713.97399: entering _queue_task() for managed-node3/command 49116 1727204713.97699: worker is 1 (out of 1 available) 49116 1727204713.97714: exiting _queue_task() for managed-node3/command 49116 1727204713.97728: done queuing things up, now waiting for results queue to drain 49116 1727204713.97729: waiting for pending results... 
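The next task, "Create veth interface lsr101" (manage_test_interface.yml:27), expands {{ interface }} to lsr101 and loops over three ip commands via the items lookup, each guarded by the same conditional on type, state and current_interfaces. As the skipped items just below show, the condition is False on this run because state is 'absent', so none of the commands execute. For reference, the loop items the task would run when creating the interface are:

    # Loop items of the skipped "Create veth interface lsr101" task, taken
    # verbatim from the skip messages below:
    ip link add lsr101 type veth peer name peerlsr101
    ip link set peerlsr101 up
    ip link set lsr101 up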
49116 1727204713.97936: running TaskExecutor() for managed-node3/TASK: Create veth interface lsr101 49116 1727204713.98022: in run() - task 127b8e07-fff9-02f7-957b-000000000940 49116 1727204713.98037: variable 'ansible_search_path' from source: unknown 49116 1727204713.98041: variable 'ansible_search_path' from source: unknown 49116 1727204713.98289: variable 'interface' from source: play vars 49116 1727204713.98355: variable 'interface' from source: play vars 49116 1727204713.98414: variable 'interface' from source: play vars 49116 1727204713.98552: Loaded config def from plugin (lookup/items) 49116 1727204713.98560: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 49116 1727204713.98582: variable 'omit' from source: magic vars 49116 1727204713.98692: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204713.98702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204713.98713: variable 'omit' from source: magic vars 49116 1727204713.98901: variable 'ansible_distribution_major_version' from source: facts 49116 1727204713.98908: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204713.99056: variable 'type' from source: play vars 49116 1727204713.99060: variable 'state' from source: include params 49116 1727204713.99063: variable 'interface' from source: play vars 49116 1727204713.99067: variable 'current_interfaces' from source: set_fact 49116 1727204713.99076: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 49116 1727204713.99079: when evaluation is False, skipping this task 49116 1727204713.99104: variable 'item' from source: unknown 49116 1727204713.99155: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add lsr101 type veth peer name peerlsr101", "skip_reason": "Conditional result was False" } 49116 1727204713.99337: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204713.99341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204713.99344: variable 'omit' from source: magic vars 49116 1727204713.99398: variable 'ansible_distribution_major_version' from source: facts 49116 1727204713.99402: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204713.99535: variable 'type' from source: play vars 49116 1727204713.99539: variable 'state' from source: include params 49116 1727204713.99542: variable 'interface' from source: play vars 49116 1727204713.99545: variable 'current_interfaces' from source: set_fact 49116 1727204713.99550: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 49116 1727204713.99553: when evaluation is False, skipping this task 49116 1727204713.99675: variable 'item' from source: unknown 49116 1727204713.99678: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerlsr101 up", "skip_reason": "Conditional result was False" } 49116 1727204713.99757: variable 
'ansible_host' from source: host vars for 'managed-node3' 49116 1727204713.99760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204713.99762: variable 'omit' from source: magic vars 49116 1727204713.99830: variable 'ansible_distribution_major_version' from source: facts 49116 1727204713.99833: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204713.99975: variable 'type' from source: play vars 49116 1727204713.99979: variable 'state' from source: include params 49116 1727204713.99982: variable 'interface' from source: play vars 49116 1727204713.99990: variable 'current_interfaces' from source: set_fact 49116 1727204714.00000: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 49116 1727204714.00003: when evaluation is False, skipping this task 49116 1727204714.00022: variable 'item' from source: unknown 49116 1727204714.00070: variable 'item' from source: unknown skipping: [managed-node3] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set lsr101 up", "skip_reason": "Conditional result was False" } 49116 1727204714.00156: dumping result to json 49116 1727204714.00158: done dumping result, returning 49116 1727204714.00161: done running TaskExecutor() for managed-node3/TASK: Create veth interface lsr101 [127b8e07-fff9-02f7-957b-000000000940] 49116 1727204714.00163: sending task result for task 127b8e07-fff9-02f7-957b-000000000940 49116 1727204714.00203: done sending task result for task 127b8e07-fff9-02f7-957b-000000000940 49116 1727204714.00207: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false } MSG: All items skipped 49116 1727204714.00248: no more pending results, returning what we have 49116 1727204714.00251: results queue empty 49116 1727204714.00252: checking for any_errors_fatal 49116 1727204714.00261: done checking for any_errors_fatal 49116 1727204714.00262: checking for max_fail_percentage 49116 1727204714.00263: done checking for max_fail_percentage 49116 1727204714.00264: checking to see if all hosts have failed and the running result is not ok 49116 1727204714.00267: done checking to see if all hosts have failed 49116 1727204714.00268: getting the remaining hosts for this loop 49116 1727204714.00269: done getting the remaining hosts for this loop 49116 1727204714.00274: getting the next task for host managed-node3 49116 1727204714.00281: done getting next task for host managed-node3 49116 1727204714.00284: ^ task is: TASK: Set up veth as managed by NetworkManager 49116 1727204714.00287: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204714.00291: getting variables 49116 1727204714.00292: in VariableManager get_vars() 49116 1727204714.00341: Calling all_inventory to load vars for managed-node3 49116 1727204714.00344: Calling groups_inventory to load vars for managed-node3 49116 1727204714.00346: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.00360: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.00362: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.00367: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.01557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.03340: done with get_vars() 49116 1727204714.03373: done getting variables 49116 1727204714.03422: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.060) 0:00:37.059 ***** 49116 1727204714.03451: entering _queue_task() for managed-node3/command 49116 1727204714.03740: worker is 1 (out of 1 available) 49116 1727204714.03756: exiting _queue_task() for managed-node3/command 49116 1727204714.03771: done queuing things up, now waiting for results queue to drain 49116 1727204714.03773: waiting for pending results... 
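The "Set up veth as managed by NetworkManager" task queued above (manage_test_interface.yml:35) is another command task, but it only applies when type == 'veth' and state == 'present'; since state is 'absent' here it is skipped immediately below, so its actual command never appears in this log. Purely as a hypothetical illustration of what such a step typically does, one common way to hand a device over to NetworkManager is:

    # Hypothetical illustration only; the skipped task's real command is not
    # visible in this log. Marking a device as managed by NetworkManager:
    nmcli device set lsr101 managed yes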
49116 1727204714.03978: running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager 49116 1727204714.04066: in run() - task 127b8e07-fff9-02f7-957b-000000000941 49116 1727204714.04078: variable 'ansible_search_path' from source: unknown 49116 1727204714.04081: variable 'ansible_search_path' from source: unknown 49116 1727204714.04117: calling self._execute() 49116 1727204714.04205: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.04209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.04219: variable 'omit' from source: magic vars 49116 1727204714.04551: variable 'ansible_distribution_major_version' from source: facts 49116 1727204714.04563: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204714.04689: variable 'type' from source: play vars 49116 1727204714.04693: variable 'state' from source: include params 49116 1727204714.04700: Evaluated conditional (type == 'veth' and state == 'present'): False 49116 1727204714.04703: when evaluation is False, skipping this task 49116 1727204714.04706: _execute() done 49116 1727204714.04708: dumping result to json 49116 1727204714.04712: done dumping result, returning 49116 1727204714.04718: done running TaskExecutor() for managed-node3/TASK: Set up veth as managed by NetworkManager [127b8e07-fff9-02f7-957b-000000000941] 49116 1727204714.04723: sending task result for task 127b8e07-fff9-02f7-957b-000000000941 skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 49116 1727204714.04896: no more pending results, returning what we have 49116 1727204714.04900: results queue empty 49116 1727204714.04901: checking for any_errors_fatal 49116 1727204714.04916: done checking for any_errors_fatal 49116 1727204714.04916: checking for max_fail_percentage 49116 1727204714.04919: done checking for max_fail_percentage 49116 1727204714.04920: checking to see if all hosts have failed and the running result is not ok 49116 1727204714.04920: done checking to see if all hosts have failed 49116 1727204714.04921: getting the remaining hosts for this loop 49116 1727204714.04922: done getting the remaining hosts for this loop 49116 1727204714.04927: getting the next task for host managed-node3 49116 1727204714.04936: done getting next task for host managed-node3 49116 1727204714.04939: ^ task is: TASK: Delete veth interface {{ interface }} 49116 1727204714.04943: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204714.04948: getting variables 49116 1727204714.04949: in VariableManager get_vars() 49116 1727204714.05155: Calling all_inventory to load vars for managed-node3 49116 1727204714.05159: Calling groups_inventory to load vars for managed-node3 49116 1727204714.05161: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.05176: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.05179: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.05183: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.05776: done sending task result for task 127b8e07-fff9-02f7-957b-000000000941 49116 1727204714.05779: WORKER PROCESS EXITING 49116 1727204714.07919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.12590: done with get_vars() 49116 1727204714.12634: done getting variables 49116 1727204714.12818: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204714.13156: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.097) 0:00:37.156 ***** 49116 1727204714.13198: entering _queue_task() for managed-node3/command 49116 1727204714.14208: worker is 1 (out of 1 available) 49116 1727204714.14221: exiting _queue_task() for managed-node3/command 49116 1727204714.14236: done queuing things up, now waiting for results queue to drain 49116 1727204714.14237: waiting for pending results... 
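The "Delete veth interface lsr101" task (manage_test_interface.yml:43) is the branch that actually runs on this pass: its conditional (type == 'veth' and state == 'absent' and interface in current_interfaces) evaluates True below, and the command module then executes a single ip invocation on managed-node3. Based on the module result reported further down, the task reduces to:

    # Command executed by the "Delete veth interface lsr101" task, as reported
    # in the command module's result below:
    ip link del lsr101 type veth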
49116 1727204714.14687: running TaskExecutor() for managed-node3/TASK: Delete veth interface lsr101 49116 1727204714.14974: in run() - task 127b8e07-fff9-02f7-957b-000000000942 49116 1727204714.14979: variable 'ansible_search_path' from source: unknown 49116 1727204714.14982: variable 'ansible_search_path' from source: unknown 49116 1727204714.15004: calling self._execute() 49116 1727204714.15187: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.15205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.15224: variable 'omit' from source: magic vars 49116 1727204714.15670: variable 'ansible_distribution_major_version' from source: facts 49116 1727204714.15694: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204714.15934: variable 'type' from source: play vars 49116 1727204714.15945: variable 'state' from source: include params 49116 1727204714.15954: variable 'interface' from source: play vars 49116 1727204714.15962: variable 'current_interfaces' from source: set_fact 49116 1727204714.15977: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 49116 1727204714.15989: variable 'omit' from source: magic vars 49116 1727204714.16039: variable 'omit' from source: magic vars 49116 1727204714.16221: variable 'interface' from source: play vars 49116 1727204714.16225: variable 'omit' from source: magic vars 49116 1727204714.16233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204714.16283: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204714.16309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204714.16338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204714.16357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204714.16395: variable 'inventory_hostname' from source: host vars for 'managed-node3' 49116 1727204714.16403: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.16411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.16527: Set connection var ansible_connection to ssh 49116 1727204714.16552: Set connection var ansible_timeout to 10 49116 1727204714.16569: Set connection var ansible_shell_executable to /bin/sh 49116 1727204714.16653: Set connection var ansible_pipelining to False 49116 1727204714.16656: Set connection var ansible_shell_type to sh 49116 1727204714.16658: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204714.16661: variable 'ansible_shell_executable' from source: unknown 49116 1727204714.16663: variable 'ansible_connection' from source: unknown 49116 1727204714.16666: variable 'ansible_module_compression' from source: unknown 49116 1727204714.16668: variable 'ansible_shell_type' from source: unknown 49116 1727204714.16671: variable 'ansible_shell_executable' from source: unknown 49116 1727204714.16673: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.16675: variable 'ansible_pipelining' from source: unknown 49116 1727204714.16677: variable 'ansible_timeout' from source: unknown 49116 1727204714.16679: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.16826: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204714.16846: variable 'omit' from source: magic vars 49116 1727204714.16860: starting attempt loop 49116 1727204714.16873: running the handler 49116 1727204714.16899: _low_level_execute_command(): starting 49116 1727204714.16913: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204714.17720: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204714.17744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204714.17758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204714.17856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204714.17890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204714.17907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204714.17927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204714.18040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204714.19896: stdout chunk (state=3): >>>/root <<< 49116 1727204714.20078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204714.20114: stdout chunk (state=3): >>><<< 49116 1727204714.20117: stderr chunk (state=3): >>><<< 49116 1727204714.20243: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204714.20248: _low_level_execute_command(): starting 49116 1727204714.20251: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646 `" && echo ansible-tmp-1727204714.2014227-51085-141902468936646="` echo /root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646 `" ) && sleep 0' 49116 1727204714.20893: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204714.20929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204714.20948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204714.21057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204714.23257: stdout chunk (state=3): >>>ansible-tmp-1727204714.2014227-51085-141902468936646=/root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646 <<< 49116 1727204714.23484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204714.23488: stdout chunk (state=3): >>><<< 49116 1727204714.23491: stderr chunk (state=3): >>><<< 49116 1727204714.23674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204714.2014227-51085-141902468936646=/root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204714.23678: variable 'ansible_module_compression' from source: unknown 49116 1727204714.23680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204714.23682: variable 'ansible_facts' from source: unknown 49116 1727204714.23745: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/AnsiballZ_command.py 49116 1727204714.24017: Sending initial data 49116 1727204714.24037: Sent initial data (156 bytes) 49116 1727204714.24725: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204714.24787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration <<< 49116 1727204714.24803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204714.24885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204714.24918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204714.24954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204714.25006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204714.25090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204714.26896: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204714.26995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49116 1727204714.27049: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpskptf3sh /root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/AnsiballZ_command.py <<< 49116 1727204714.27110: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/AnsiballZ_command.py" <<< 49116 1727204714.27172: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpskptf3sh" to remote "/root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/AnsiballZ_command.py" <<< 49116 1727204714.28424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204714.28428: stdout chunk (state=3): >>><<< 49116 1727204714.28431: stderr chunk (state=3): >>><<< 49116 1727204714.28435: done transferring module to remote 49116 1727204714.28438: _low_level_execute_command(): starting 49116 1727204714.28440: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/ /root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/AnsiballZ_command.py && sleep 0' 49116 1727204714.29072: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204714.29091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204714.29104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204714.29123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204714.29143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 <<< 49116 1727204714.29249: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204714.29278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204714.29395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204714.31568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204714.31609: stderr chunk (state=3): >>><<< 49116 1727204714.31613: stdout chunk (state=3): >>><<< 49116 1727204714.31726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204714.31730: _low_level_execute_command(): starting 49116 1727204714.31733: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/AnsiballZ_command.py && sleep 0' 49116 1727204714.32472: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204714.32491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204714.32507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204714.32526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49116 1727204714.32590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204714.32672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204714.32705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204714.32769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204714.32846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204714.51416: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-24 15:05:14.503015", "end": "2024-09-24 15:05:14.512429", "delta": "0:00:00.009414", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204714.54498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204714.54566: stderr chunk (state=3): >>><<< 49116 1727204714.54571: stdout chunk (state=3): >>><<< 49116 1727204714.54588: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-24 15:05:14.503015", "end": "2024-09-24 15:05:14.512429", "delta": "0:00:00.009414", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
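The record above shows the shape every command task in this log follows: the module's result comes back as a single JSON object on stdout, while all of the OpenSSH debug chatter stays on stderr. A minimal Python sketch of pulling that result apart is below; the helper name and the trimmed sample payload are illustrative only, not Ansible's internal API.

import json

def parse_module_result(stdout: str) -> dict:
    # stdout from a successful module run is expected to hold exactly one
    # JSON document, as in the record above; the ssh debug goes to stderr.
    return json.loads(stdout.strip())

raw_stdout = ('{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
              '"cmd": ["ip", "link", "del", "lsr101", "type", "veth"], '
              '"delta": "0:00:00.009414", "msg": ""}')
result = parse_module_result(raw_stdout)
print(result["rc"], result["cmd"])   # 0 ['ip', 'link', 'del', 'lsr101', 'type', 'veth']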
49116 1727204714.54619: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr101 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204714.54628: _low_level_execute_command(): starting 49116 1727204714.54639: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204714.2014227-51085-141902468936646/ > /dev/null 2>&1 && sleep 0' 49116 1727204714.55138: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204714.55142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204714.55145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204714.55147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204714.55207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204714.55210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204714.55292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204714.57344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204714.57403: stderr chunk (state=3): >>><<< 49116 1727204714.57406: stdout chunk (state=3): >>><<< 49116 1727204714.57423: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204714.57430: handler run complete 49116 1727204714.57455: Evaluated conditional (False): False 49116 1727204714.57468: attempt loop complete, returning result 49116 1727204714.57471: _execute() done 49116 1727204714.57474: dumping result to json 49116 1727204714.57480: done dumping result, returning 49116 1727204714.57488: done running TaskExecutor() for managed-node3/TASK: Delete veth interface lsr101 [127b8e07-fff9-02f7-957b-000000000942] 49116 1727204714.57493: sending task result for task 127b8e07-fff9-02f7-957b-000000000942 49116 1727204714.57599: done sending task result for task 127b8e07-fff9-02f7-957b-000000000942 49116 1727204714.57602: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr101", "type", "veth" ], "delta": "0:00:00.009414", "end": "2024-09-24 15:05:14.512429", "rc": 0, "start": "2024-09-24 15:05:14.503015" } 49116 1727204714.57677: no more pending results, returning what we have 49116 1727204714.57681: results queue empty 49116 1727204714.57682: checking for any_errors_fatal 49116 1727204714.57687: done checking for any_errors_fatal 49116 1727204714.57688: checking for max_fail_percentage 49116 1727204714.57689: done checking for max_fail_percentage 49116 1727204714.57690: checking to see if all hosts have failed and the running result is not ok 49116 1727204714.57691: done checking to see if all hosts have failed 49116 1727204714.57692: getting the remaining hosts for this loop 49116 1727204714.57694: done getting the remaining hosts for this loop 49116 1727204714.57699: getting the next task for host managed-node3 49116 1727204714.57706: done getting next task for host managed-node3 49116 1727204714.57709: ^ task is: TASK: Create dummy interface {{ interface }} 49116 1727204714.57719: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204714.57724: getting variables 49116 1727204714.57725: in VariableManager get_vars() 49116 1727204714.57772: Calling all_inventory to load vars for managed-node3 49116 1727204714.57775: Calling groups_inventory to load vars for managed-node3 49116 1727204714.57777: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.57789: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.57792: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.57795: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.58850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.60180: done with get_vars() 49116 1727204714.60201: done getting variables 49116 1727204714.60255: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204714.60351: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.471) 0:00:37.628 ***** 49116 1727204714.60377: entering _queue_task() for managed-node3/command 49116 1727204714.60685: worker is 1 (out of 1 available) 49116 1727204714.60702: exiting _queue_task() for managed-node3/command 49116 1727204714.60716: done queuing things up, now waiting for results queue to drain 49116 1727204714.60717: waiting for pending results... 
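Taken together, the veth-deletion records above trace the full round trip the ssh connection makes for one command task when pipelining is off: create a uniquely named remote tmp directory, push AnsiballZ_command.py over sftp, chmod it, run it with the remote Python, read the JSON result, then remove the tmp directory. A rough stand-alone sketch of that sequence with plain subprocess calls is below; it illustrates the steps in the log, not Ansible's connection plugin, and the local payload path is hypothetical.

import random
import subprocess
import time

host = "root@10.31.45.169"                      # managed node from the log
local_module = "/tmp/AnsiballZ_command.py"      # hypothetical local payload

# 1. uniquely named remote tmp dir, created with a restrictive umask
tmp = f"/root/.ansible/tmp/ansible-tmp-{time.time()}-{random.randrange(2**48)}"
subprocess.run(["ssh", host, f"umask 77 && mkdir -p {tmp}"], check=True)

# 2. transfer the module payload over sftp (the log shows 'sftp> put ...')
subprocess.run(["sftp", "-b", "-", host], text=True, check=True,
               input=f"put {local_module} {tmp}/AnsiballZ_command.py\n")

# 3. make the directory and payload executable for the owner
subprocess.run(["ssh", host, f"chmod u+x {tmp} {tmp}/AnsiballZ_command.py"], check=True)

# 4. run the payload with the remote interpreter; the JSON result is on stdout
run = subprocess.run(["ssh", host, f"/usr/bin/python3.12 {tmp}/AnsiballZ_command.py"],
                     capture_output=True, text=True, check=True)
print(run.stdout)

# 5. clean up the remote tmp dir again
subprocess.run(["ssh", host, f"rm -rf {tmp}"], check=True)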
49116 1727204714.60922: running TaskExecutor() for managed-node3/TASK: Create dummy interface lsr101 49116 1727204714.60998: in run() - task 127b8e07-fff9-02f7-957b-000000000943 49116 1727204714.61014: variable 'ansible_search_path' from source: unknown 49116 1727204714.61018: variable 'ansible_search_path' from source: unknown 49116 1727204714.61051: calling self._execute() 49116 1727204714.61150: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.61154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.61167: variable 'omit' from source: magic vars 49116 1727204714.61500: variable 'ansible_distribution_major_version' from source: facts 49116 1727204714.61511: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204714.61677: variable 'type' from source: play vars 49116 1727204714.61681: variable 'state' from source: include params 49116 1727204714.61684: variable 'interface' from source: play vars 49116 1727204714.61690: variable 'current_interfaces' from source: set_fact 49116 1727204714.61698: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 49116 1727204714.61700: when evaluation is False, skipping this task 49116 1727204714.61705: _execute() done 49116 1727204714.61707: dumping result to json 49116 1727204714.61710: done dumping result, returning 49116 1727204714.61720: done running TaskExecutor() for managed-node3/TASK: Create dummy interface lsr101 [127b8e07-fff9-02f7-957b-000000000943] 49116 1727204714.61722: sending task result for task 127b8e07-fff9-02f7-957b-000000000943 49116 1727204714.61817: done sending task result for task 127b8e07-fff9-02f7-957b-000000000943 49116 1727204714.61822: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204714.61878: no more pending results, returning what we have 49116 1727204714.61882: results queue empty 49116 1727204714.61883: checking for any_errors_fatal 49116 1727204714.61892: done checking for any_errors_fatal 49116 1727204714.61892: checking for max_fail_percentage 49116 1727204714.61894: done checking for max_fail_percentage 49116 1727204714.61895: checking to see if all hosts have failed and the running result is not ok 49116 1727204714.61896: done checking to see if all hosts have failed 49116 1727204714.61897: getting the remaining hosts for this loop 49116 1727204714.61898: done getting the remaining hosts for this loop 49116 1727204714.61903: getting the next task for host managed-node3 49116 1727204714.61910: done getting next task for host managed-node3 49116 1727204714.61913: ^ task is: TASK: Delete dummy interface {{ interface }} 49116 1727204714.61916: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204714.61921: getting variables 49116 1727204714.61922: in VariableManager get_vars() 49116 1727204714.61979: Calling all_inventory to load vars for managed-node3 49116 1727204714.61982: Calling groups_inventory to load vars for managed-node3 49116 1727204714.61984: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.61996: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.61999: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.62002: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.63061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.64276: done with get_vars() 49116 1727204714.64305: done getting variables 49116 1727204714.64358: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204714.64455: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.041) 0:00:37.669 ***** 49116 1727204714.64482: entering _queue_task() for managed-node3/command 49116 1727204714.64775: worker is 1 (out of 1 available) 49116 1727204714.64792: exiting _queue_task() for managed-node3/command 49116 1727204714.64807: done queuing things up, now waiting for results queue to drain 49116 1727204714.64808: waiting for pending results... 
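The skipped tasks above each end with a line such as "Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False", after which the executor logs "when evaluation is False, skipping this task". Conceptually the when: clause is a Jinja2 expression evaluated against the task's variables; a small sketch with the jinja2 library, using illustrative values for this run, is below (it mirrors the idea, not the executor's actual code path).

from jinja2 import Environment

when = Environment().compile_expression(
    "type == 'dummy' and state == 'present' and interface not in current_interfaces"
)

# Values as this run reports them: the test interface is a veth being removed,
# so the dummy-creation condition evaluates to False and the task is skipped.
print(when(type="veth", state="absent",
           interface="lsr101", current_interfaces=["eth0", "lo"]))   # False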
49116 1727204714.65022: running TaskExecutor() for managed-node3/TASK: Delete dummy interface lsr101 49116 1727204714.65101: in run() - task 127b8e07-fff9-02f7-957b-000000000944 49116 1727204714.65106: variable 'ansible_search_path' from source: unknown 49116 1727204714.65110: variable 'ansible_search_path' from source: unknown 49116 1727204714.65147: calling self._execute() 49116 1727204714.65235: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.65243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.65253: variable 'omit' from source: magic vars 49116 1727204714.65569: variable 'ansible_distribution_major_version' from source: facts 49116 1727204714.65582: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204714.65733: variable 'type' from source: play vars 49116 1727204714.65741: variable 'state' from source: include params 49116 1727204714.65745: variable 'interface' from source: play vars 49116 1727204714.65751: variable 'current_interfaces' from source: set_fact 49116 1727204714.65761: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 49116 1727204714.65764: when evaluation is False, skipping this task 49116 1727204714.65768: _execute() done 49116 1727204714.65771: dumping result to json 49116 1727204714.65774: done dumping result, returning 49116 1727204714.65776: done running TaskExecutor() for managed-node3/TASK: Delete dummy interface lsr101 [127b8e07-fff9-02f7-957b-000000000944] 49116 1727204714.65783: sending task result for task 127b8e07-fff9-02f7-957b-000000000944 49116 1727204714.65876: done sending task result for task 127b8e07-fff9-02f7-957b-000000000944 49116 1727204714.65879: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204714.65952: no more pending results, returning what we have 49116 1727204714.65956: results queue empty 49116 1727204714.65958: checking for any_errors_fatal 49116 1727204714.65967: done checking for any_errors_fatal 49116 1727204714.65968: checking for max_fail_percentage 49116 1727204714.65969: done checking for max_fail_percentage 49116 1727204714.65970: checking to see if all hosts have failed and the running result is not ok 49116 1727204714.65971: done checking to see if all hosts have failed 49116 1727204714.65972: getting the remaining hosts for this loop 49116 1727204714.65973: done getting the remaining hosts for this loop 49116 1727204714.65978: getting the next task for host managed-node3 49116 1727204714.65985: done getting next task for host managed-node3 49116 1727204714.65989: ^ task is: TASK: Create tap interface {{ interface }} 49116 1727204714.65992: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204714.65996: getting variables 49116 1727204714.65998: in VariableManager get_vars() 49116 1727204714.66042: Calling all_inventory to load vars for managed-node3 49116 1727204714.66045: Calling groups_inventory to load vars for managed-node3 49116 1727204714.66047: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.66058: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.66061: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.66063: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.67251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.68446: done with get_vars() 49116 1727204714.68483: done getting variables 49116 1727204714.68534: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204714.68628: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.041) 0:00:37.711 ***** 49116 1727204714.68654: entering _queue_task() for managed-node3/command 49116 1727204714.68954: worker is 1 (out of 1 available) 49116 1727204714.68972: exiting _queue_task() for managed-node3/command 49116 1727204714.68986: done queuing things up, now waiting for results queue to drain 49116 1727204714.68987: waiting for pending results... 
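The Create/Delete tasks for the dummy and tap types are the remaining branches of the same manage_test_interface include; only the branch whose type, state and current-interface check all match actually runs, which in this play was the veth deletion logged earlier. A hedged Python sketch of that dispatch is below: only the ('veth', 'absent') command is taken verbatim from the log, the other entries are plausible equivalents rather than the playbook's exact commands.

def manage_test_interface(kind, state, interface, current_interfaces):
    commands = {
        ("dummy", "present"): ["ip", "link", "add", interface, "type", "dummy"],
        ("dummy", "absent"):  ["ip", "link", "del", interface, "type", "dummy"],
        ("tap",   "present"): ["ip", "tuntap", "add", interface, "mode", "tap"],
        ("tap",   "absent"):  ["ip", "tuntap", "del", interface, "mode", "tap"],
        ("veth",  "absent"):  ["ip", "link", "del", interface, "type", "veth"],
    }
    exists = interface in current_interfaces
    if (state == "present" and exists) or (state == "absent" and not exists):
        return None                    # nothing to do; the task would be skipped
    return commands.get((kind, state))

print(manage_test_interface("veth", "absent", "lsr101", ["lsr101", "eth0"]))
# ['ip', 'link', 'del', 'lsr101', 'type', 'veth']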
49116 1727204714.69191: running TaskExecutor() for managed-node3/TASK: Create tap interface lsr101 49116 1727204714.69278: in run() - task 127b8e07-fff9-02f7-957b-000000000945 49116 1727204714.69291: variable 'ansible_search_path' from source: unknown 49116 1727204714.69295: variable 'ansible_search_path' from source: unknown 49116 1727204714.69332: calling self._execute() 49116 1727204714.69423: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.69429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.69440: variable 'omit' from source: magic vars 49116 1727204714.69761: variable 'ansible_distribution_major_version' from source: facts 49116 1727204714.69774: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204714.69929: variable 'type' from source: play vars 49116 1727204714.69933: variable 'state' from source: include params 49116 1727204714.69941: variable 'interface' from source: play vars 49116 1727204714.69945: variable 'current_interfaces' from source: set_fact 49116 1727204714.69953: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 49116 1727204714.69956: when evaluation is False, skipping this task 49116 1727204714.69959: _execute() done 49116 1727204714.69962: dumping result to json 49116 1727204714.69965: done dumping result, returning 49116 1727204714.69973: done running TaskExecutor() for managed-node3/TASK: Create tap interface lsr101 [127b8e07-fff9-02f7-957b-000000000945] 49116 1727204714.69981: sending task result for task 127b8e07-fff9-02f7-957b-000000000945 49116 1727204714.70076: done sending task result for task 127b8e07-fff9-02f7-957b-000000000945 49116 1727204714.70079: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204714.70140: no more pending results, returning what we have 49116 1727204714.70144: results queue empty 49116 1727204714.70146: checking for any_errors_fatal 49116 1727204714.70153: done checking for any_errors_fatal 49116 1727204714.70153: checking for max_fail_percentage 49116 1727204714.70155: done checking for max_fail_percentage 49116 1727204714.70157: checking to see if all hosts have failed and the running result is not ok 49116 1727204714.70157: done checking to see if all hosts have failed 49116 1727204714.70158: getting the remaining hosts for this loop 49116 1727204714.70159: done getting the remaining hosts for this loop 49116 1727204714.70164: getting the next task for host managed-node3 49116 1727204714.70173: done getting next task for host managed-node3 49116 1727204714.70177: ^ task is: TASK: Delete tap interface {{ interface }} 49116 1727204714.70180: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204714.70185: getting variables 49116 1727204714.70186: in VariableManager get_vars() 49116 1727204714.70233: Calling all_inventory to load vars for managed-node3 49116 1727204714.70236: Calling groups_inventory to load vars for managed-node3 49116 1727204714.70238: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.70251: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.70254: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.70256: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.71313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.72650: done with get_vars() 49116 1727204714.72675: done getting variables 49116 1727204714.72726: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49116 1727204714.72820: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.041) 0:00:37.753 ***** 49116 1727204714.72846: entering _queue_task() for managed-node3/command 49116 1727204714.73140: worker is 1 (out of 1 available) 49116 1727204714.73156: exiting _queue_task() for managed-node3/command 49116 1727204714.73170: done queuing things up, now waiting for results queue to drain 49116 1727204714.73172: waiting for pending results... 
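Every ssh invocation in this log reports "auto-mux: Trying existing master at '/root/.ansible/cp/1846617821'" and then obtains a session on the already-open master ("mux_client_request_session: master session id: 2"), so per-task round trips skip key exchange and authentication. That is OpenSSH connection multiplexing; a sketch of the option list that produces this behaviour is below, with illustrative values, since the exact settings Ansible passes are configurable.

def ssh_mux_args(control_dir="/root/.ansible/cp"):
    return [
        "-o", "ControlMaster=auto",               # start a master if none exists yet
        "-o", f"ControlPath={control_dir}/%C",    # one socket per host/port/user hash
        "-o", "ControlPersist=60s",               # keep the master alive between tasks
    ]

print(["ssh", *ssh_mux_args(), "root@10.31.45.169", "true"])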
49116 1727204714.73394: running TaskExecutor() for managed-node3/TASK: Delete tap interface lsr101 49116 1727204714.73473: in run() - task 127b8e07-fff9-02f7-957b-000000000946 49116 1727204714.73487: variable 'ansible_search_path' from source: unknown 49116 1727204714.73491: variable 'ansible_search_path' from source: unknown 49116 1727204714.73529: calling self._execute() 49116 1727204714.73618: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.73622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.73632: variable 'omit' from source: magic vars 49116 1727204714.73954: variable 'ansible_distribution_major_version' from source: facts 49116 1727204714.73968: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204714.74118: variable 'type' from source: play vars 49116 1727204714.74122: variable 'state' from source: include params 49116 1727204714.74128: variable 'interface' from source: play vars 49116 1727204714.74131: variable 'current_interfaces' from source: set_fact 49116 1727204714.74140: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 49116 1727204714.74143: when evaluation is False, skipping this task 49116 1727204714.74146: _execute() done 49116 1727204714.74149: dumping result to json 49116 1727204714.74152: done dumping result, returning 49116 1727204714.74158: done running TaskExecutor() for managed-node3/TASK: Delete tap interface lsr101 [127b8e07-fff9-02f7-957b-000000000946] 49116 1727204714.74163: sending task result for task 127b8e07-fff9-02f7-957b-000000000946 49116 1727204714.74260: done sending task result for task 127b8e07-fff9-02f7-957b-000000000946 49116 1727204714.74264: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49116 1727204714.74350: no more pending results, returning what we have 49116 1727204714.74354: results queue empty 49116 1727204714.74355: checking for any_errors_fatal 49116 1727204714.74362: done checking for any_errors_fatal 49116 1727204714.74363: checking for max_fail_percentage 49116 1727204714.74367: done checking for max_fail_percentage 49116 1727204714.74368: checking to see if all hosts have failed and the running result is not ok 49116 1727204714.74369: done checking to see if all hosts have failed 49116 1727204714.74369: getting the remaining hosts for this loop 49116 1727204714.74371: done getting the remaining hosts for this loop 49116 1727204714.74375: getting the next task for host managed-node3 49116 1727204714.74387: done getting next task for host managed-node3 49116 1727204714.74390: ^ task is: TASK: Verify network state restored to default 49116 1727204714.74394: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204714.74398: getting variables 49116 1727204714.74400: in VariableManager get_vars() 49116 1727204714.74442: Calling all_inventory to load vars for managed-node3 49116 1727204714.74445: Calling groups_inventory to load vars for managed-node3 49116 1727204714.74447: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.74459: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.74461: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.74464: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.76026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.77296: done with get_vars() 49116 1727204714.77325: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:77 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.045) 0:00:37.799 ***** 49116 1727204714.77409: entering _queue_task() for managed-node3/include_tasks 49116 1727204714.77709: worker is 1 (out of 1 available) 49116 1727204714.77725: exiting _queue_task() for managed-node3/include_tasks 49116 1727204714.77741: done queuing things up, now waiting for results queue to drain 49116 1727204714.77743: waiting for pending results... 49116 1727204714.77939: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 49116 1727204714.78062: in run() - task 127b8e07-fff9-02f7-957b-0000000000ab 49116 1727204714.78271: variable 'ansible_search_path' from source: unknown 49116 1727204714.78276: calling self._execute() 49116 1727204714.78279: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.78282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.78285: variable 'omit' from source: magic vars 49116 1727204714.79018: variable 'ansible_distribution_major_version' from source: facts 49116 1727204714.79041: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204714.79053: _execute() done 49116 1727204714.79062: dumping result to json 49116 1727204714.79071: done dumping result, returning 49116 1727204714.79081: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [127b8e07-fff9-02f7-957b-0000000000ab] 49116 1727204714.79097: sending task result for task 127b8e07-fff9-02f7-957b-0000000000ab 49116 1727204714.79254: no more pending results, returning what we have 49116 1727204714.79260: in VariableManager get_vars() 49116 1727204714.79360: Calling all_inventory to load vars for managed-node3 49116 1727204714.79363: Calling groups_inventory to load vars for managed-node3 49116 1727204714.79367: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.79387: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.79390: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.79394: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.80238: done sending task result for task 127b8e07-fff9-02f7-957b-0000000000ab 49116 1727204714.80244: WORKER PROCESS EXITING 49116 1727204714.81846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.84164: done with get_vars() 49116 1727204714.84206: 
variable 'ansible_search_path' from source: unknown 49116 1727204714.84226: we have included files to process 49116 1727204714.84227: generating all_blocks data 49116 1727204714.84230: done generating all_blocks data 49116 1727204714.84238: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 49116 1727204714.84239: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 49116 1727204714.84242: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 49116 1727204714.84732: done processing included file 49116 1727204714.84737: iterating over new_blocks loaded from include file 49116 1727204714.84739: in VariableManager get_vars() 49116 1727204714.84761: done with get_vars() 49116 1727204714.84762: filtering new block on tags 49116 1727204714.84786: done filtering new block on tags 49116 1727204714.84789: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 49116 1727204714.84795: extending task lists for all hosts with included blocks 49116 1727204714.88705: done extending task lists 49116 1727204714.88708: done processing included files 49116 1727204714.88709: results queue empty 49116 1727204714.88709: checking for any_errors_fatal 49116 1727204714.88713: done checking for any_errors_fatal 49116 1727204714.88714: checking for max_fail_percentage 49116 1727204714.88715: done checking for max_fail_percentage 49116 1727204714.88721: checking to see if all hosts have failed and the running result is not ok 49116 1727204714.88722: done checking to see if all hosts have failed 49116 1727204714.88723: getting the remaining hosts for this loop 49116 1727204714.88724: done getting the remaining hosts for this loop 49116 1727204714.88727: getting the next task for host managed-node3 49116 1727204714.88732: done getting next task for host managed-node3 49116 1727204714.88737: ^ task is: TASK: Check routes and DNS 49116 1727204714.88739: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204714.88742: getting variables 49116 1727204714.88743: in VariableManager get_vars() 49116 1727204714.88763: Calling all_inventory to load vars for managed-node3 49116 1727204714.88767: Calling groups_inventory to load vars for managed-node3 49116 1727204714.88770: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204714.88778: Calling all_plugins_play to load vars for managed-node3 49116 1727204714.88780: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204714.88783: Calling groups_plugins_play to load vars for managed-node3 49116 1727204714.90627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204714.92954: done with get_vars() 49116 1727204714.92997: done getting variables 49116 1727204714.93059: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:05:14 -0400 (0:00:00.156) 0:00:37.955 ***** 49116 1727204714.93095: entering _queue_task() for managed-node3/shell 49116 1727204714.93522: worker is 1 (out of 1 available) 49116 1727204714.93540: exiting _queue_task() for managed-node3/shell 49116 1727204714.93553: done queuing things up, now waiting for results queue to drain 49116 1727204714.93554: waiting for pending results... 49116 1727204714.93773: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 49116 1727204714.93897: in run() - task 127b8e07-fff9-02f7-957b-000000000b17 49116 1727204714.93921: variable 'ansible_search_path' from source: unknown 49116 1727204714.93928: variable 'ansible_search_path' from source: unknown 49116 1727204714.93976: calling self._execute() 49116 1727204714.94091: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.94103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.94271: variable 'omit' from source: magic vars 49116 1727204714.94522: variable 'ansible_distribution_major_version' from source: facts 49116 1727204714.94541: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204714.94553: variable 'omit' from source: magic vars 49116 1727204714.94602: variable 'omit' from source: magic vars 49116 1727204714.94644: variable 'omit' from source: magic vars 49116 1727204714.94697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49116 1727204714.94744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49116 1727204714.94773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49116 1727204714.94801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204714.94820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49116 1727204714.94856: variable 'inventory_hostname' from source: host vars for 'managed-node3' 
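The "Check routes and DNS" task above is queued through the shell action rather than command; as the module_args earlier in this log show ('_uses_shell': false for the ip link task), the practical difference is whether the remote command string is handed to /bin/sh or split into an argv list. A small local illustration of that distinction is below; it is conceptual only and does not go through a connection plugin.

import shlex
import subprocess

def run_task(raw_params, uses_shell):
    if uses_shell:
        # shell-style: the string is interpreted by /bin/sh (variables, pipes, globs)
        return subprocess.run(raw_params, shell=True, capture_output=True, text=True)
    # command-style: split into argv, no shell features
    return subprocess.run(shlex.split(raw_params), capture_output=True, text=True)

print(run_task("echo $HOME", uses_shell=True).stdout.strip())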
49116 1727204714.94867: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.94875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.94983: Set connection var ansible_connection to ssh 49116 1727204714.95003: Set connection var ansible_timeout to 10 49116 1727204714.95017: Set connection var ansible_shell_executable to /bin/sh 49116 1727204714.95027: Set connection var ansible_pipelining to False 49116 1727204714.95035: Set connection var ansible_shell_type to sh 49116 1727204714.95046: Set connection var ansible_module_compression to ZIP_DEFLATED 49116 1727204714.95079: variable 'ansible_shell_executable' from source: unknown 49116 1727204714.95087: variable 'ansible_connection' from source: unknown 49116 1727204714.95096: variable 'ansible_module_compression' from source: unknown 49116 1727204714.95103: variable 'ansible_shell_type' from source: unknown 49116 1727204714.95110: variable 'ansible_shell_executable' from source: unknown 49116 1727204714.95118: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204714.95126: variable 'ansible_pipelining' from source: unknown 49116 1727204714.95133: variable 'ansible_timeout' from source: unknown 49116 1727204714.95142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204714.95301: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204714.95324: variable 'omit' from source: magic vars 49116 1727204714.95470: starting attempt loop 49116 1727204714.95473: running the handler 49116 1727204714.95476: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49116 1727204714.95478: _low_level_execute_command(): starting 49116 1727204714.95480: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49116 1727204714.96234: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found <<< 49116 1727204714.96251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204714.96345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204714.96388: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204714.96458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204714.98335: stdout chunk (state=3): >>>/root <<< 49116 1727204714.98523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204714.98543: stdout chunk (state=3): >>><<< 49116 1727204714.98557: stderr chunk (state=3): >>><<< 49116 1727204714.98592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204714.98614: _low_level_execute_command(): starting 49116 1727204714.98629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469 `" && echo ansible-tmp-1727204714.9859915-51110-62189302531469="` echo /root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469 `" ) && sleep 0' 49116 1727204714.99409: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204714.99511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204714.99515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204714.99563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204714.99642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
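Just above, the executor sets the connection variables for this task (ansible_connection ssh, timeout 10, shell /bin/sh, pipelining False, ZIP_DEFLATED module compression) and then probes the remote home directory with 'echo ~ && sleep 0' before creating another ansible-tmp directory. The short sketch below ties those settings to the behaviour seen in the log; the dictionary simply restates values from the log, and the pipelining note is the usual interpretation rather than something the log states explicitly.

conn_vars = {
    "ansible_connection": "ssh",
    "ansible_timeout": 10,
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_pipelining": False,
    "ansible_module_compression": "ZIP_DEFLATED",
}

# With pipelining disabled the module cannot be streamed over stdin, which is
# why each task repeats the tmp-dir / sftp / chmod / execute / cleanup cycle.
needs_remote_tmpdir = not conn_vars["ansible_pipelining"]
print(needs_remote_tmpdir)   # True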
49116 1727204715.01861: stdout chunk (state=3): >>>ansible-tmp-1727204714.9859915-51110-62189302531469=/root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469 <<< 49116 1727204715.02076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204715.02098: stderr chunk (state=3): >>><<< 49116 1727204715.02117: stdout chunk (state=3): >>><<< 49116 1727204715.02138: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204714.9859915-51110-62189302531469=/root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204715.02198: variable 'ansible_module_compression' from source: unknown 49116 1727204715.02301: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49116o0a1nkjo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49116 1727204715.02314: variable 'ansible_facts' from source: unknown 49116 1727204715.02403: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/AnsiballZ_command.py 49116 1727204715.02705: Sending initial data 49116 1727204715.02708: Sent initial data (155 bytes) 49116 1727204715.03374: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204715.03391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49116 1727204715.03494: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204715.03527: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204715.03643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204715.05459: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 49116 1727204715.05500: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49116 1727204715.05564: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49116 1727204715.05661: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpgf73wc32 /root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/AnsiballZ_command.py <<< 49116 1727204715.05664: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/AnsiballZ_command.py" <<< 49116 1727204715.05731: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49116o0a1nkjo/tmpgf73wc32" to remote "/root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/AnsiballZ_command.py" <<< 49116 1727204715.06702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204715.06755: stderr chunk (state=3): >>><<< 49116 1727204715.06771: stdout chunk (state=3): >>><<< 49116 1727204715.06814: done transferring module to remote 49116 1727204715.06836: _low_level_execute_command(): starting 49116 1727204715.06872: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/ /root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/AnsiballZ_command.py && sleep 0' 49116 1727204715.07606: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49116 1727204715.07627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49116 1727204715.07759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204715.07816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204715.07897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204715.10046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204715.10051: stdout chunk (state=3): >>><<< 49116 1727204715.10058: stderr chunk (state=3): >>><<< 49116 1727204715.10111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204715.10114: _low_level_execute_command(): starting 49116 1727204715.10117: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/AnsiballZ_command.py && sleep 0' 49116 1727204715.10917: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204715.10949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204715.11074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204715.29695: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN 
group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3540sec preferred_lft 3540sec\n inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:05:15.285345", "end": "2024-09-24 15:05:15.295267", "delta": "0:00:00.009922", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49116 1727204715.31514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 
<<< 49116 1727204715.31578: stderr chunk (state=3): >>><<< 49116 1727204715.31582: stdout chunk (state=3): >>><<< 49116 1727204715.31603: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3540sec preferred_lft 3540sec\n inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:05:15.285345", "end": "2024-09-24 15:05:15.295267", "delta": "0:00:00.009922", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.169 closed. 49116 1727204715.31645: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49116 1727204715.31654: _low_level_execute_command(): starting 49116 1727204715.31659: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204714.9859915-51110-62189302531469/ > /dev/null 2>&1 && sleep 0' 49116 1727204715.32149: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49116 1727204715.32153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204715.32157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49116 1727204715.32215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' <<< 49116 1727204715.32218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49116 1727204715.32225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49116 1727204715.32297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49116 1727204715.34328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49116 1727204715.34394: stderr chunk (state=3): >>><<< 49116 
1727204715.34398: stdout chunk (state=3): >>><<< 49116 1727204715.34413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.169 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.169 originally 10.31.45.169 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1846617821' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49116 1727204715.34421: handler run complete 49116 1727204715.34442: Evaluated conditional (False): False 49116 1727204715.34455: attempt loop complete, returning result 49116 1727204715.34458: _execute() done 49116 1727204715.34461: dumping result to json 49116 1727204715.34472: done dumping result, returning 49116 1727204715.34480: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [127b8e07-fff9-02f7-957b-000000000b17] 49116 1727204715.34485: sending task result for task 127b8e07-fff9-02f7-957b-000000000b17 49116 1727204715.34605: done sending task result for task 127b8e07-fff9-02f7-957b-000000000b17 49116 1727204715.34608: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009922", "end": "2024-09-24 15:05:15.295267", "rc": 0, "start": "2024-09-24 15:05:15.285345" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:aa:78:a8:9b:13 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.169/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3540sec preferred_lft 3540sec inet6 fe80::aa:78ff:fea8:9b13/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.169 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.169 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. 
# # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 49116 1727204715.34696: no more pending results, returning what we have 49116 1727204715.34701: results queue empty 49116 1727204715.34702: checking for any_errors_fatal 49116 1727204715.34703: done checking for any_errors_fatal 49116 1727204715.34704: checking for max_fail_percentage 49116 1727204715.34706: done checking for max_fail_percentage 49116 1727204715.34707: checking to see if all hosts have failed and the running result is not ok 49116 1727204715.34708: done checking to see if all hosts have failed 49116 1727204715.34708: getting the remaining hosts for this loop 49116 1727204715.34710: done getting the remaining hosts for this loop 49116 1727204715.34714: getting the next task for host managed-node3 49116 1727204715.34729: done getting next task for host managed-node3 49116 1727204715.34735: ^ task is: TASK: Verify DNS and network connectivity 49116 1727204715.34742: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204715.34746: getting variables 49116 1727204715.34748: in VariableManager get_vars() 49116 1727204715.34789: Calling all_inventory to load vars for managed-node3 49116 1727204715.34792: Calling groups_inventory to load vars for managed-node3 49116 1727204715.34794: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204715.34805: Calling all_plugins_play to load vars for managed-node3 49116 1727204715.34808: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204715.34810: Calling groups_plugins_play to load vars for managed-node3 49116 1727204715.36265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204715.37837: done with get_vars() 49116 1727204715.37871: done getting variables 49116 1727204715.37922: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.448) 0:00:38.404 ***** 49116 1727204715.37947: entering _queue_task() for managed-node3/shell 49116 1727204715.38245: worker is 1 (out of 1 available) 49116 1727204715.38262: exiting _queue_task() for managed-node3/shell 49116 1727204715.38277: done queuing things up, now waiting for results queue to drain 49116 1727204715.38279: waiting for pending results... 49116 1727204715.38493: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 49116 1727204715.38564: in run() - task 127b8e07-fff9-02f7-957b-000000000b18 49116 1727204715.38585: variable 'ansible_search_path' from source: unknown 49116 1727204715.38653: variable 'ansible_search_path' from source: unknown 49116 1727204715.38658: calling self._execute() 49116 1727204715.38749: variable 'ansible_host' from source: host vars for 'managed-node3' 49116 1727204715.38753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 49116 1727204715.38769: variable 'omit' from source: magic vars 49116 1727204715.39372: variable 'ansible_distribution_major_version' from source: facts 49116 1727204715.39376: Evaluated conditional (ansible_distribution_major_version != '6'): True 49116 1727204715.39379: variable 'ansible_facts' from source: unknown 49116 1727204715.40486: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 49116 1727204715.40498: when evaluation is False, skipping this task 49116 1727204715.40508: _execute() done 49116 1727204715.40519: dumping result to json 49116 1727204715.40526: done dumping result, returning 49116 1727204715.40538: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [127b8e07-fff9-02f7-957b-000000000b18] 49116 1727204715.40547: sending task result for task 127b8e07-fff9-02f7-957b-000000000b18 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 49116 1727204715.40720: no more pending results, returning what we have 49116 1727204715.40725: results queue empty 49116 1727204715.40727: checking for any_errors_fatal 49116 
1727204715.40742: done checking for any_errors_fatal 49116 1727204715.40744: checking for max_fail_percentage 49116 1727204715.40746: done checking for max_fail_percentage 49116 1727204715.40748: checking to see if all hosts have failed and the running result is not ok 49116 1727204715.40749: done checking to see if all hosts have failed 49116 1727204715.40750: getting the remaining hosts for this loop 49116 1727204715.40751: done getting the remaining hosts for this loop 49116 1727204715.40757: getting the next task for host managed-node3 49116 1727204715.40771: done getting next task for host managed-node3 49116 1727204715.40774: ^ task is: TASK: meta (flush_handlers) 49116 1727204715.40777: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204715.40784: getting variables 49116 1727204715.40786: in VariableManager get_vars() 49116 1727204715.40837: Calling all_inventory to load vars for managed-node3 49116 1727204715.40841: Calling groups_inventory to load vars for managed-node3 49116 1727204715.40843: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204715.40861: Calling all_plugins_play to load vars for managed-node3 49116 1727204715.40864: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204715.40972: Calling groups_plugins_play to load vars for managed-node3 49116 1727204715.41573: done sending task result for task 127b8e07-fff9-02f7-957b-000000000b18 49116 1727204715.41580: WORKER PROCESS EXITING 49116 1727204715.42268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204715.43963: done with get_vars() 49116 1727204715.44014: done getting variables 49116 1727204715.44077: in VariableManager get_vars() 49116 1727204715.44093: Calling all_inventory to load vars for managed-node3 49116 1727204715.44095: Calling groups_inventory to load vars for managed-node3 49116 1727204715.44097: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204715.44101: Calling all_plugins_play to load vars for managed-node3 49116 1727204715.44103: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204715.44104: Calling groups_plugins_play to load vars for managed-node3 49116 1727204715.45084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204715.46277: done with get_vars() 49116 1727204715.46311: done queuing things up, now waiting for results queue to drain 49116 1727204715.46313: results queue empty 49116 1727204715.46313: checking for any_errors_fatal 49116 1727204715.46316: done checking for any_errors_fatal 49116 1727204715.46316: checking for max_fail_percentage 49116 1727204715.46317: done checking for max_fail_percentage 49116 1727204715.46318: checking to see if all hosts have failed and the running result is not ok 49116 1727204715.46318: done checking to see if all hosts have failed 49116 1727204715.46319: getting the remaining hosts for this loop 49116 1727204715.46320: done getting the remaining hosts for this loop 49116 1727204715.46322: getting the next task for host managed-node3 49116 1727204715.46325: done getting next task for host managed-node3 49116 1727204715.46326: ^ task is: TASK: meta (flush_handlers) 49116 
1727204715.46327: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49116 1727204715.46329: getting variables 49116 1727204715.46330: in VariableManager get_vars() 49116 1727204715.46341: Calling all_inventory to load vars for managed-node3 49116 1727204715.46343: Calling groups_inventory to load vars for managed-node3 49116 1727204715.46344: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204715.46350: Calling all_plugins_play to load vars for managed-node3 49116 1727204715.46351: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204715.46353: Calling groups_plugins_play to load vars for managed-node3 49116 1727204715.47225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204715.48407: done with get_vars() 49116 1727204715.48438: done getting variables 49116 1727204715.48483: in VariableManager get_vars() 49116 1727204715.48496: Calling all_inventory to load vars for managed-node3 49116 1727204715.48498: Calling groups_inventory to load vars for managed-node3 49116 1727204715.48499: Calling all_plugins_inventory to load vars for managed-node3 49116 1727204715.48503: Calling all_plugins_play to load vars for managed-node3 49116 1727204715.48505: Calling groups_plugins_inventory to load vars for managed-node3 49116 1727204715.48506: Calling groups_plugins_play to load vars for managed-node3 49116 1727204715.49417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49116 1727204715.50616: done with get_vars() 49116 1727204715.50650: done queuing things up, now waiting for results queue to drain 49116 1727204715.50652: results queue empty 49116 1727204715.50653: checking for any_errors_fatal 49116 1727204715.50654: done checking for any_errors_fatal 49116 1727204715.50654: checking for max_fail_percentage 49116 1727204715.50655: done checking for max_fail_percentage 49116 1727204715.50656: checking to see if all hosts have failed and the running result is not ok 49116 1727204715.50656: done checking to see if all hosts have failed 49116 1727204715.50657: getting the remaining hosts for this loop 49116 1727204715.50657: done getting the remaining hosts for this loop 49116 1727204715.50667: getting the next task for host managed-node3 49116 1727204715.50670: done getting next task for host managed-node3 49116 1727204715.50671: ^ task is: None 49116 1727204715.50672: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49116 1727204715.50673: done queuing things up, now waiting for results queue to drain 49116 1727204715.50673: results queue empty 49116 1727204715.50674: checking for any_errors_fatal 49116 1727204715.50674: done checking for any_errors_fatal 49116 1727204715.50674: checking for max_fail_percentage 49116 1727204715.50675: done checking for max_fail_percentage 49116 1727204715.50676: checking to see if all hosts have failed and the running result is not ok 49116 1727204715.50676: done checking to see if all hosts have failed 49116 1727204715.50680: getting the next task for host managed-node3 49116 1727204715.50681: done getting next task for host managed-node3 49116 1727204715.50682: ^ task is: None 49116 1727204715.50683: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3 : ok=78 changed=2 unreachable=0 failed=0 skipped=68 rescued=0 ignored=0

Tuesday 24 September 2024 15:05:15 -0400 (0:00:00.128) 0:00:38.532 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.86s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.80s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.10s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
Install iproute --------------------------------------------------------- 1.97s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 1.60s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install iproute --------------------------------------------------------- 1.53s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Create veth interface lsr101 -------------------------------------------- 1.49s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.33s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.07s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather current interface info ------------------------------------------- 1.05s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.94s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather the minimum subset of ansible_facts required by the network role test --- 0.92s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.85s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.85s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Check if system is ostree ----------------------------------------------- 0.82s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.73s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather current interface info ------------------------------------------- 0.58s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Stat profile file ------------------------------------------------------- 0.58s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Gather current interface info ------------------------------------------- 0.50s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
49116 1727204715.50788: RUNNING CLEANUP
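For reference, the shell script behind the "Check routes and DNS" task recorded above is reproduced here, unescaped from the cmd field of its result; the content is verbatim from the log, with only the \n escapes expanded:

    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
     cat /etc/resolv.conf
    else
     echo NO /etc/resolv.conf
     ls -alrtF /etc/resolv.* || :
    fi

The follow-up task "Verify DNS and network connectivity" (check_network_dns.yml:24) was skipped on this run because its conditional ansible_facts["distribution"] == "CentOS" evaluated to False for managed-node3, as shown in the skipping result above.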