[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
33277 1726883055.73812: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-AQL
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
33277 1726883055.74966: Added group all to inventory
33277 1726883055.74968: Added group ungrouped to inventory
33277 1726883055.74973: Group all now contains ungrouped
33277 1726883055.74976: Examining possible inventory source: /tmp/network-mVt/inventory.yml
33277 1726883056.10016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
33277 1726883056.10084: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
33277 1726883056.10115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
33277 1726883056.10201: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
33277 1726883056.10492: Loaded config def from plugin (inventory/script)
33277 1726883056.10495: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
33277 1726883056.10539: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
33277 1726883056.10843: Loaded config def from plugin (inventory/yaml)
33277 1726883056.10846: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
33277 1726883056.10938: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
33277 1726883056.12056: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
33277 1726883056.12060: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
33277 1726883056.12063: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
33277 1726883056.12069: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
33277 1726883056.12074: Loading data from /tmp/network-mVt/inventory.yml
33277 1726883056.12237: /tmp/network-mVt/inventory.yml was not parsable by auto
33277 1726883056.12421: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
33277 1726883056.12469: Loading data from /tmp/network-mVt/inventory.yml
33277 1726883056.12740: group all already in inventory
33277 1726883056.12747: set inventory_file for managed_node1
33277 1726883056.12756: set inventory_dir for managed_node1
33277 1726883056.12758: Added host managed_node1 to inventory
33277 1726883056.12760: Added host managed_node1 to group all
33277 1726883056.12762: set ansible_host for managed_node1
33277 1726883056.12763: set ansible_ssh_extra_args for managed_node1
33277 1726883056.12767: set inventory_file for managed_node2
33277 1726883056.12770: set inventory_dir for managed_node2
33277 1726883056.12771: Added host managed_node2 to inventory
33277 1726883056.12773: Added host managed_node2 to group all
33277 1726883056.12774: set ansible_host for managed_node2
33277 1726883056.12774: set ansible_ssh_extra_args for managed_node2
33277 1726883056.12777: set inventory_file for managed_node3
33277 1726883056.12780: set inventory_dir for managed_node3
33277 1726883056.12781: Added host managed_node3 to inventory
33277 1726883056.12782: Added host managed_node3 to group all
33277 1726883056.12783: set ansible_host for managed_node3
33277 1726883056.12783: set ansible_ssh_extra_args for managed_node3
33277 1726883056.12786: Reconcile groups and hosts in inventory.
33277 1726883056.12790: Group ungrouped now contains managed_node1
33277 1726883056.12792: Group ungrouped now contains managed_node2
33277 1726883056.12794: Group ungrouped now contains managed_node3
33277 1726883056.12997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
33277 1726883056.13311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
33277 1726883056.13369: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
33277 1726883056.13401: Loaded config def from plugin (vars/host_group_vars)
33277 1726883056.13403: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
33277 1726883056.13528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
33277 1726883056.13537: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
33277 1726883056.13585: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
33277 1726883056.14332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883056.14546: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
33277 1726883056.14589: Loaded config def from plugin (connection/local)
33277 1726883056.14593: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
33277 1726883056.16208: Loaded config def from plugin (connection/paramiko_ssh)
33277 1726883056.16212: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
33277 1726883056.18328: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
33277 1726883056.18376: Loaded config def from plugin (connection/psrp)
33277 1726883056.18380: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
33277 1726883056.20228: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
33277 1726883056.20279: Loaded config def from plugin (connection/ssh)
33277 1726883056.20282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
33277 1726883056.25960: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
33277 1726883056.26006: Loaded config def from plugin (connection/winrm)
33277 1726883056.26010: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
33277 1726883056.26050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
33277 1726883056.26199: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
33277 1726883056.26399: Loaded config def from plugin (shell/cmd)
33277 1726883056.26401: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
33277 1726883056.26435: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
33277 1726883056.26738: Loaded config def from plugin (shell/powershell)
33277 1726883056.26741: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
33277 1726883056.26803: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
33277 1726883056.27317: Loaded config def from plugin (shell/sh)
33277 1726883056.27320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
33277 1726883056.27359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
33277 1726883056.27721: Loaded config def from plugin (become/runas)
33277 1726883056.27726: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
33277 1726883056.28180: Loaded config def from plugin (become/su)
33277 1726883056.28183: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
33277 1726883056.28695: Loaded config def from plugin (become/sudo)
33277 1726883056.28697: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
33277 1726883056.28741: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
33277 1726883056.29655: in VariableManager get_vars()
33277 1726883056.29679: done with get_vars()
33277 1726883056.30031: trying /usr/local/lib/python3.12/site-packages/ansible/modules
33277 1726883056.38306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
33277 1726883056.38557: in VariableManager get_vars()
33277 1726883056.38562: done with get_vars()
33277 1726883056.38565: variable 'playbook_dir' from source: magic vars
33277 1726883056.38566: variable 'ansible_playbook_python' from source: magic vars
33277 1726883056.38567: variable 'ansible_config_file' from source: magic vars
33277 1726883056.38568: variable 'groups' from source: magic vars
33277 1726883056.38569: variable 'omit' from source: magic vars
33277 1726883056.38570: variable 'ansible_version' from source: magic vars
33277 1726883056.38570: variable 'ansible_check_mode' from source: magic vars
33277 1726883056.38571: variable 'ansible_diff_mode' from source: magic vars
33277 1726883056.38572: variable 'ansible_forks' from source: magic vars
33277 1726883056.38573: variable 'ansible_inventory_sources' from source: magic vars
33277 1726883056.38574: variable 'ansible_skip_tags' from source: magic vars
33277 1726883056.38575: variable 'ansible_limit' from source: magic vars
33277 1726883056.38575: variable 'ansible_run_tags' from source: magic vars
33277 1726883056.38576: variable 'ansible_verbosity' from source: magic vars
33277 1726883056.38734: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml
33277 1726883056.40699: in VariableManager get_vars()
33277 1726883056.40717: done with get_vars()
33277 1726883056.40959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
33277 1726883056.41373: in VariableManager get_vars()
33277 1726883056.41389: done with get_vars()
33277 1726883056.41394: variable 'omit' from source: magic vars
33277 1726883056.41414: variable 'omit' from source: magic vars
33277 1726883056.41452: in VariableManager get_vars()
33277 1726883056.41462: done with get_vars()
33277 1726883056.41512: in VariableManager get_vars()
33277 1726883056.41831: done with get_vars()
33277 1726883056.41872: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
33277 1726883056.42225: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
33277 1726883056.42667: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
33277 1726883056.44403: in VariableManager get_vars()
33277 1726883056.44850: done with get_vars()
33277 1726883056.46135: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
33277 1726883056.46711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33277 1726883056.51958: in VariableManager get_vars()
33277 1726883056.51981: done with get_vars()
33277 1726883056.51990: variable 'omit' from source: magic vars
33277 1726883056.52002: variable 'omit' from source: magic vars
33277 1726883056.52043: in VariableManager get_vars()
33277 1726883056.52076: done with get_vars()
33277 1726883056.52101: in VariableManager get_vars()
33277 1726883056.52117: done with get_vars()
33277 1726883056.52470: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
33277 1726883056.52668: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
33277 1726883056.52764: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
33277 1726883056.59217: in VariableManager get_vars()
33277 1726883056.59338: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33277 1726883056.64435: in VariableManager get_vars()
33277 1726883056.64466: done with get_vars()
33277 1726883056.64472: variable 'omit' from source: magic vars
33277 1726883056.64484: variable 'omit' from source: magic vars
33277 1726883056.64521: in VariableManager get_vars()
33277 1726883056.64541: done with get_vars()
33277 1726883056.64677: in VariableManager get_vars()
33277 1726883056.64698: done with get_vars()
33277 1726883056.64788: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
33277 1726883056.65041: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
33277 1726883056.65248: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
33277 1726883056.66258: in VariableManager get_vars()
33277 1726883056.66283: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33277 1726883056.71117: in VariableManager get_vars()
33277 1726883056.71404: done with get_vars()
33277 1726883056.71411: variable 'omit' from source: magic vars
33277 1726883056.71442: variable 'omit' from source: magic vars
33277 1726883056.71599: in VariableManager get_vars()
33277 1726883056.71621: done with get_vars()
33277 1726883056.71647: in VariableManager get_vars()
33277 1726883056.71669: done with get_vars()
33277 1726883056.72137: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
33277 1726883056.72266: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
33277 1726883056.72674: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
33277 1726883056.74024: in VariableManager get_vars()
33277 1726883056.74056: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33277 1726883056.77496: in VariableManager get_vars()
33277 1726883056.77531: done with get_vars()
33277 1726883056.77572: in VariableManager get_vars()
33277 1726883056.77596: done with get_vars()
33277 1726883056.77669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
33277 1726883056.77690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
33277 1726883056.77970: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
33277 1726883056.78167: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
33277 1726883056.78170: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
33277 1726883056.78205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
33277 1726883056.78278: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
33277 1726883056.78706: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
33277 1726883056.78775: Loaded config def from plugin (callback/default)
33277 1726883056.78777: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
33277 1726883056.80864: Loaded config def from plugin (callback/junit)
33277 1726883056.80867: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
33277 1726883056.80920: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
33277 1726883056.81001: Loaded config def from plugin (callback/minimal)
33277 1726883056.81004: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
33277 1726883056.81047: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
33277 1726883056.81118: Loaded config def from plugin (callback/tree)
33277 1726883056.81121: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
33277 1726883056.81254: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
33277 1726883056.81256: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_wireless_nm.yml ************************************************
2 plays in /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
33277 1726883056.81284: in VariableManager get_vars()
33277 1726883056.81298: done with get_vars()
33277 1726883056.81304: in VariableManager get_vars()
33277 1726883056.81313: done with get_vars()
33277 1726883056.81317: variable 'omit' from source: magic vars
33277 1726883056.81362: in VariableManager get_vars()
33277 1726883056.81376: done with get_vars()
33277 1726883056.81398: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_wireless.yml' with nm as provider] *********
33277 1726883056.81986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
33277 1726883056.82089: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
33277 1726883056.82120: getting the remaining hosts for this loop
33277 1726883056.82125: done getting the remaining hosts for this loop
33277 1726883056.82128: getting the next task for host managed_node2
33277 1726883056.82131: done getting next task for host managed_node2
33277 1726883056.82133: ^ task is: TASK: Gathering Facts
33277 1726883056.82135: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883056.82137: getting variables
33277 1726883056.82138: in VariableManager get_vars()
33277 1726883056.82197: Calling all_inventory to load vars for managed_node2
33277 1726883056.82201: Calling groups_inventory to load vars for managed_node2
33277 1726883056.82204: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883056.82217: Calling all_plugins_play to load vars for managed_node2
33277 1726883056.82231: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883056.82235: Calling groups_plugins_play to load vars for managed_node2
33277 1726883056.82270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883056.82556: done with get_vars()
33277 1726883056.82563: done getting variables
33277 1726883056.82769: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Friday 20 September 2024  21:44:16 -0400 (0:00:00.016)       0:00:00.016 ******
33277 1726883056.82791: entering _queue_task() for managed_node2/gather_facts
33277 1726883056.82793: Creating lock for gather_facts
33277 1726883056.83556: worker is 1 (out of 1 available)
33277 1726883056.83568: exiting _queue_task() for managed_node2/gather_facts
33277 1726883056.83580: done queuing things up, now waiting for results queue to drain
33277 1726883056.83582: waiting for pending results...
33277 1726883056.83965: running TaskExecutor() for managed_node2/TASK: Gathering Facts
33277 1726883056.84151: in run() - task 0affc7ec-ae25-6628-6da4-000000000147
33277 1726883056.84167: variable 'ansible_search_path' from source: unknown
33277 1726883056.84208: calling self._execute()
33277 1726883056.84366: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883056.84372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883056.84382: variable 'omit' from source: magic vars
33277 1726883056.84649: variable 'omit' from source: magic vars
33277 1726883056.84680: variable 'omit' from source: magic vars
33277 1726883056.84767: variable 'omit' from source: magic vars
33277 1726883056.84947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
33277 1726883056.84973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
33277 1726883056.85033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
33277 1726883056.85140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33277 1726883056.85148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
33277 1726883056.85227: variable 'inventory_hostname' from source: host vars for 'managed_node2'
33277 1726883056.85231: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883056.85233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883056.85418: Set connection var ansible_pipelining to False
33277 1726883056.85423: Set connection var ansible_connection to ssh
33277 1726883056.85430: Set connection var ansible_timeout to 10
33277 1726883056.85439: Set connection var ansible_shell_executable to /bin/sh
33277 1726883056.85442: Set connection var ansible_shell_type to sh
33277 1726883056.85533: Set connection var ansible_module_compression to ZIP_DEFLATED
33277 1726883056.85559: variable 'ansible_shell_executable' from source: unknown
33277 1726883056.85563: variable 'ansible_connection' from source: unknown
33277 1726883056.85592: variable 'ansible_module_compression' from source: unknown
33277 1726883056.85601: variable 'ansible_shell_type' from source: unknown
33277 1726883056.85603: variable 'ansible_shell_executable' from source: unknown
33277 1726883056.85606: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883056.85608: variable 'ansible_pipelining' from source: unknown
33277 1726883056.85611: variable 'ansible_timeout' from source: unknown
33277 1726883056.85613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883056.86007: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
33277 1726883056.86012: variable 'omit' from source: magic vars
33277 1726883056.86014: starting attempt loop
33277 1726883056.86017: running the handler
33277 1726883056.86225: variable 'ansible_facts' from source: unknown
33277 1726883056.86234: _low_level_execute_command(): starting
33277 1726883056.86238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
33277 1726883056.87598: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
33277 1726883056.87607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33277 1726883056.87611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
33277 1726883056.87614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33277 1726883056.87796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK <<<
33277 1726883056.87803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33277 1726883056.87897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33277 1726883056.89746: stdout chunk (state=3): >>>/root <<<
33277 1726883056.89750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33277 1726883056.89885: stderr chunk (state=3): >>><<<
33277 1726883056.89891: stdout chunk (state=3): >>><<<
33277 1726883056.90028: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
33277 1726883056.90032: _low_level_execute_command(): starting
33277 1726883056.90035: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255 `" && echo ansible-tmp-1726883056.899129-33328-230731285251255="` echo /root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255 `" ) && sleep 0'
33277 1726883056.91105: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
33277 1726883056.91109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found <<<
33277 1726883056.91112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
33277 1726883056.91114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
33277 1726883056.91126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
33277 1726883056.91436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK <<<
33277 1726883056.91454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
33277 1726883056.91551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
33277 1726883056.93570: stdout chunk (state=3): >>>ansible-tmp-1726883056.899129-33328-230731285251255=/root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255 <<<
33277 1726883056.93999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
33277 1726883056.94003: stdout chunk (state=3): >>><<<
33277 1726883056.94005: stderr chunk (state=3): >>><<<
33277 1726883056.94008: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883056.899129-33328-230731285251255=/root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for
'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883056.94039: variable 'ansible_module_compression' from source: unknown 33277 1726883056.94328: ANSIBALLZ: Using generic lock for ansible.legacy.setup 33277 1726883056.94331: ANSIBALLZ: Acquiring lock 33277 1726883056.94333: ANSIBALLZ: Lock acquired: 140085462455344 33277 1726883056.94335: ANSIBALLZ: Creating module 33277 1726883057.30650: ANSIBALLZ: Writing module into payload 33277 1726883057.30819: ANSIBALLZ: Writing module 33277 1726883057.30855: ANSIBALLZ: Renaming module 33277 1726883057.30867: ANSIBALLZ: Done creating module 33277 1726883057.30910: variable 'ansible_facts' from source: unknown 33277 1726883057.30926: variable 'inventory_hostname' from source: host vars for 'managed_node2' 33277 1726883057.30941: _low_level_execute_command(): starting 33277 1726883057.30952: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 33277 1726883057.31494: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883057.31510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 
10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883057.31535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883057.31569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883057.31582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883057.31658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33277 1726883057.34124: stdout chunk (state=3): >>>PLATFORM <<< 33277 1726883057.34242: stdout chunk (state=3): >>>Linux <<< 33277 1726883057.34296: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 33277 1726883057.34474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883057.34531: stderr chunk (state=3): >>><<< 33277 1726883057.34538: stdout chunk (state=3): >>><<< 33277 1726883057.34557: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33277 1726883057.34568 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 33277 1726883057.34608: _low_level_execute_command(): starting 33277 1726883057.34611: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 33277 1726883057.34699: Sending initial data 33277 1726883057.34703: Sent initial data (1181 bytes) 33277 1726883057.35094: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883057.35097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found <<< 33277 1726883057.35099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config <<< 33277 1726883057.35102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found <<< 33277 1726883057.35109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883057.35160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883057.35167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883057.35169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883057.35213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33277 1726883057.40470: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 33277 1726883057.41093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883057.41154: stderr chunk (state=3): >>><<< 33277 1726883057.41157: stdout chunk (state=3): >>><<< 33277 1726883057.41170: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": 
[], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33277 1726883057.41246: variable 'ansible_facts' from source: unknown 33277 
1726883057.41252: variable 'ansible_facts' from source: unknown 33277 1726883057.41260: variable 'ansible_module_compression' from source: unknown 33277 1726883057.41294: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33277prfh61zr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 33277 1726883057.41317: variable 'ansible_facts' from source: unknown 33277 1726883057.41451: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/AnsiballZ_setup.py 33277 1726883057.41567: Sending initial data 33277 1726883057.41571: Sent initial data (153 bytes) 33277 1726883057.42067: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883057.42071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found <<< 33277 1726883057.42073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 33277 1726883057.42075: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883057.42077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883057.42134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883057.42142: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883057.42190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33277 1726883057.44113: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 33277 1726883057.44158: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 33277 1726883057.44209: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33277prfh61zr/tmp1wzwmy1d /root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/AnsiballZ_setup.py <<< 33277 1726883057.44214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/AnsiballZ_setup.py" <<< 33277 1726883057.44257: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-33277prfh61zr/tmp1wzwmy1d" to remote "/root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/AnsiballZ_setup.py" <<< 33277 1726883057.45348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883057.45430: stderr chunk (state=3): >>><<< 33277 1726883057.45435: stdout chunk (state=3): >>><<< 33277 1726883057.45466: done transferring module to remote 33277 1726883057.45483: _low_level_execute_command(): starting 33277 1726883057.45488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/ /root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/AnsiballZ_setup.py && sleep 0' 33277 1726883057.46212: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 1726883057.46236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883057.46255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883057.46275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33277 1726883057.46291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 <<< 33277 
1726883057.46303: stderr chunk (state=3): >>>debug2: match not found <<< 33277 1726883057.46338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found <<< 33277 1726883057.46430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883057.46445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883057.46486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883057.46595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883057.48988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883057.49073: stderr chunk (state=3): >>><<< 33277 1726883057.49126: stdout chunk (state=3): >>><<< 33277 1726883057.49338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883057.49342: _low_level_execute_command(): starting 33277 1726883057.49345: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/AnsiballZ_setup.py && sleep 0' 33277 1726883057.50821: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 1726883057.50838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 
setting O_NONBLOCK <<< 33277 1726883057.50853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883057.51001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883057.53663: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # <<< 33277 1726883057.53780: stdout chunk (state=3): >>># installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02df4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02dc3b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02df6ab0> import '_signal' # <<< 33277 1726883057.53802: stdout chunk (state=3): >>>import '_abc' # <<< 33277 1726883057.53912: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 33277 1726883057.54105: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user 
site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 33277 1726883057.54129: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 33277 1726883057.54141: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02e051c0> <<< 33277 1726883057.54249: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02e05fd0> <<< 33277 1726883057.54252: stdout chunk (state=3): >>>import 'site' # <<< 33277 1726883057.54282: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 33277 1726883057.54785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 33277 1726883057.55247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02be3dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02be3fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c1b800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches 
/usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c1be90> import '_collections' # <<< 33277 1726883057.55250: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02bfbaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02bf91c0> <<< 33277 1726883057.55443: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02be0f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 33277 1726883057.55450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 33277 1726883057.55638: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c3f770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c3e390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02bfa060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c3cbc0> <<< 33277 1726883057.55748: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6c7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02be0200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c6cc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6cb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883057.55754: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c6cef0> <<< 33277 1726883057.55766: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02bded20> <<< 33277 1726883057.55789: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883057.55810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 33277 1726883057.56145: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6d5e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6d2b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6e4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c886e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c89e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 33277 1726883057.56175: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 33277 1726883057.56178: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c8ac90> <<< 33277 1726883057.56447: stdout 
chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c8b2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c8a210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c8bd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c8b4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 33277 1726883057.56452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 33277 1726883057.56465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 33277 1726883057.56645: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02987c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches 
/usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc029b07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029b0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc029b07d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc029b09b0> <<< 33277 1726883057.56649: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02985df0> <<< 33277 1726883057.56663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 33277 1726883057.56764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 33277 1726883057.56792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 33277 1726883057.56855: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fdc029b2090> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029b0d40> <<< 33277 1726883057.56870: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6ec00> <<< 33277 1726883057.56938: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883057.56963: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 33277 1726883057.57001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 33277 1726883057.57031: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029da450> <<< 33277 1726883057.57087: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 33277 1726883057.57107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883057.57133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 33277 1726883057.57149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 33277 1726883057.57241: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029f65a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 33277 1726883057.57262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 33277 
1726883057.57307: stdout chunk (state=3): >>>import 'ntpath' # <<< 33277 1726883057.57339: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02a2b320> <<< 33277 1726883057.57362: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 33277 1726883057.57399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 33277 1726883057.57433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 33277 1726883057.57471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 33277 1726883057.57631: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02a51a90> <<< 33277 1726883057.57650: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02a2b440> <<< 33277 1726883057.57689: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029f7230> <<< 33277 1726883057.57748: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02870410> <<< 33277 1726883057.57755: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029f55e0> <<< 33277 
1726883057.57836: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029b2ff0> <<< 33277 1726883057.57908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 33277 1726883057.57939: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdc029f5370> <<< 33277 1726883057.58110: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_uwo3sf93/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 33277 1726883057.58269: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.58298: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 33277 1726883057.58311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 33277 1726883057.58349: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 33277 1726883057.58424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 33277 1726883057.58462: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028da120> <<< 33277 1726883057.58475: stdout chunk (state=3): >>>import '_typing' # <<< 33277 1726883057.58732: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028b1010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028b01a0> <<< 33277 1726883057.58735: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 33277 1726883057.58813: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.58816: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 33277 1726883057.58830: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.60354: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.61905: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028b3fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02909ac0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02909850> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02909160> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code 
object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02909640> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028dadb0> import 'atexit' # <<< 33277 1726883057.61935: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc0290a840> <<< 33277 1726883057.61970: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc0290aa80> <<< 33277 1726883057.62118: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0290af30> import 'pwd' # <<< 33277 1726883057.62137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 33277 1726883057.62162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 33277 1726883057.62195: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02770c80> <<< 33277 1726883057.62229: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 
'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027728a0> <<< 33277 1726883057.62257: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 33277 1726883057.62273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 33277 1726883057.62343: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02773260> <<< 33277 1726883057.62358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 33277 1726883057.62387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02774440> <<< 33277 1726883057.62442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 33277 1726883057.62464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 33277 1726883057.62569: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02776ed0> <<< 33277 1726883057.62576: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02777200> <<< 33277 1726883057.62601: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02775190> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 33277 1726883057.62628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 33277 1726883057.62751: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 33277 1726883057.62754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 33277 1726883057.62788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0277aea0> import '_tokenize' # <<< 33277 1726883057.62861: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02779970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027796d0> <<< 33277 1726883057.62870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 33277 1726883057.62885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 33277 1726883057.62973: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0277be60> <<< 33277 1726883057.63091: stdout chunk 
(state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027756a0> <<< 33277 1726883057.63121: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027bf0b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027bf1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 33277 1726883057.63149: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027c0da0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027c0b60> <<< 33277 1726883057.63169: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 33277 1726883057.63347: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027c32f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027c1490> <<< 33277 1726883057.63372: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 33277 1726883057.63414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883057.63529: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 33277 1726883057.63549: stdout chunk (state=3): >>>import '_string' # <<< 33277 1726883057.63569: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027cea50> <<< 33277 1726883057.63638: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027c33e0> <<< 33277 1726883057.63717: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027cfd40> <<< 33277 1726883057.63758: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027cf770> <<< 33277 1726883057.63876: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883057.63897: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027cfd70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027bf470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 33277 1726883057.63925: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883057.63949: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027d34d0> <<< 33277 1726883057.64135: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027d46e0> <<< 33277 1726883057.64428: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d1c40> <<< 33277 1726883057.64432: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027d2fc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d1820> # zipimport: zlib available <<< 33277 1726883057.64457: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 33277 1726883057.64497: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 33277 1726883057.64632: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.64761: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.65437: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.66129: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc0265c7d0> <<< 33277 1726883057.66627: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0265d4f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d08c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 33277 1726883057.66736: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0265d520> # zipimport: zlib available <<< 33277 1726883057.67249: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.67750: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.67921: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.67926: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 33277 1726883057.67929: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.68056: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 33277 1726883057.68087: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.68226: stdout chunk (state=3): 
>>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 33277 1726883057.68239: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.68277: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.68383: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 33277 1726883057.68635: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.68931: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 33277 1726883057.69256: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0265f110> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 33277 1726883057.69545: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02666150> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02666ab0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0265f1a0> # zipimport: zlib available <<< 33277 1726883057.69577: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.69629: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 33277 1726883057.69632: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.69761: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.69787: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.69867: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 33277 1726883057.69937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883057.70145: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc026657c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02666cc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 33277 1726883057.70158: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.70229: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.70254: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.70307: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883057.70331: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 33277 1726883057.70366: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 33277 1726883057.70437: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 33277 1726883057.70474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 33277 1726883057.70478: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 33277 1726883057.70481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 33277 1726883057.70540: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc026facc0> <<< 33277 1726883057.70583: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02670b00> <<< 33277 1726883057.70675: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0266ab10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0266a960> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 33277 1726883057.70697: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.70721: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.70754: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 33277 1726883057.70817: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 33277 1726883057.70863: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 33277 1726883057.70969: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.70994: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71016: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71074: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71096: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71131: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71184: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71213: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 33277 1726883057.71383: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71387: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71409: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71420: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71452: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 33277 1726883057.71463: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71657: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71848: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71959: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.71998: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # 
code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883057.72046: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 33277 1726883057.72084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 33277 1726883057.72087: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 33277 1726883057.72123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 33277 1726883057.72127: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02701a30> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 33277 1726883057.72172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 33277 1726883057.72175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 33277 1726883057.72265: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 33277 1726883057.72277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02144260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module 
'_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc021445c0> <<< 33277 1726883057.72327: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc026e12e0> <<< 33277 1726883057.72411: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc026e0320> <<< 33277 1726883057.72436: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02700110> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02703b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 33277 1726883057.72469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 33277 1726883057.72496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 33277 1726883057.72560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 33277 1726883057.72599: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02147650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02146f00> <<< 33277 1726883057.72603: stdout chunk 
(state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc021470e0> <<< 33277 1726883057.72681: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02146330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 33277 1726883057.72759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 33277 1726883057.72795: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02147740> <<< 33277 1726883057.72827: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 33277 1726883057.72887: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc021b2210> <<< 33277 1726883057.72928: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d1ee0> <<< 33277 1726883057.72932: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027011f0> import 'ansible.module_utils.facts.timeout' 
# <<< 33277 1726883057.73017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 33277 1726883057.73020: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 33277 1726883057.73024: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.73130: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.73133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 33277 1726883057.73172: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.73358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 33277 1726883057.73362: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.73375: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 33277 1726883057.73406: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.73467: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 33277 1726883057.73519: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.73581: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 33277 1726883057.73673: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.73701: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.73762: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.73829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 33277 1726883057.73896: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 33277 1726883057.74411: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.74914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 33277 1726883057.74979: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.74992: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75047: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75111: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75165: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 33277 1726883057.75177: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.75284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.75326: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 33277 1726883057.75350: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75374: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75425: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 33277 1726883057.75428: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75456: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75518: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 33277 1726883057.75527: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75599: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.75728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from 
'/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 33277 1726883057.75751: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc021b24e0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 33277 1726883057.75860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 33277 1726883057.75913: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc021b3140> import 'ansible.module_utils.facts.system.local' # <<< 33277 1726883057.75932: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.76001: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.76087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 33277 1726883057.76188: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.76297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 33277 1726883057.76313: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.76430: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.76751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 33277 1726883057.76754: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883057.76756: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc021de5d0> <<< 33277 1726883057.76975: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc021cb050> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 33277 1726883057.77034: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.77096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 33277 1726883057.77176: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.77202: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.77282: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.77421: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.77604: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 33277 1726883057.77640: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.77696: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 33277 1726883057.77699: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.77807: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.77815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 33277 1726883057.77878: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc01b05b50> import 
'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b05760> <<< 33277 1726883057.78033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 33277 1726883057.78036: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.78058: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 33277 1726883057.78178: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.78373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 33277 1726883057.78482: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.78581: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.78629: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.78683: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 33277 1726883057.78803: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.78898: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.79052: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 33277 1726883057.79119: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.79566: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.80358: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.81164: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.linux' # <<< 33277 1726883057.81186: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware.hurd' # <<< 33277 1726883057.81269: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.81496: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.81625: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 33277 1726883057.81641: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883057.81752: stdout chunk (state=3): >>> <<< 33277 1726883057.81832: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883057.81854: stdout chunk (state=3): >>> <<< 33277 1726883057.82132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 33277 1726883057.82153: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.82336: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.83166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.83637: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.83778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.83910: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 33277 1726883057.84006: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.84045: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.84124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 33277 1726883057.84445: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.84748: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 33277 1726883057.84784: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.84856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 33277 1726883057.84900: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.84931: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 33277 1726883057.84956: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.84986: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.85055: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 33277 1726883057.85106: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 33277 1726883057.85426: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # 
<<< 33277 1726883057.85458: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33277 1726883057.85480: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.85575: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.85589: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.85659: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.85807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 33277 1726883057.85825: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.85869: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 33277 1726883057.85891: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.86098: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.86542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 33277 1726883057.86638: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883057.86768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 33277 1726883057.87061: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 33277 1726883057.87124: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 
1726883057.88947: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 33277 1726883057.89078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 33277 1726883057.89159: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc01b2e5d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b2ca40> <<< 33277 1726883057.89213: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d0560> <<< 33277 1726883059.49746: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 33277 1726883059.49964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b74e30> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b75ca0> # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883059.50029: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc021e4350> <<< 33277 1726883059.50044: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b779e0> <<< 33277 1726883059.50642: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 33277 1726883059.70895: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-13-201.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-13-201", "ansible_nodename": "ip-10-31-13-201.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec271e862ced9ef36c7d9c93e54dc434", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC4W6DnNxvXYP35ucYbXw0Rl8U5i76+nSxNOU9FlzmsJfP+uzNhFbvB4JNXnwJkgLimlQx7Nu4tUfylwwUKU5RSOnfX7+XU/7U5N+ASiAKnaE1eM8bey+vKw9yUCMWqMP2JJIgbUfrj1fualRuP7TrWzyiaD45ZlzS8WUIPQfUcjJeKBuBpKm2txHt8z07reCn9Fo3J0MgPpZzqYyBtz5cZQnqf00a57ZIS+In/5ZiOM6vvUsdnOcOJGDxnyvRpRnI/sIkkY1r225c9v45LCL1yhDWwDf5R1XcreHVgFvphaGxscm73CzunaAx07tOElGh9BdFCrRFyxdmW1+ZtrQ3PMZ09fRbdch7zE5b4TZkzJbvzN7gcJ20YhE+rMmJaOo/JHUip77V84gKyvbg1sSNgYkgUatYc4ak/dpXrGmdz5cTjowJXnle1DjXdDs/awxg1674TWMgDTcHLLj1RZ9NE6IoHTPzIBcMTgzpJlnV9v978N3Ar/pLxkGAPT8Q0+f8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDTe5a484DuRaUfRyVR9WiLfG+w2SIuQ3XCHSggW57gjmGhOPH7dR2w1D1xTofL2l7g+iaW6X0H/koP81LSjWMM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAILH/nH8SHxMjAzlrA3ts+XxnIQkq1Q/jggpukWw+sAXV", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 54870 10.31.13.201 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 54870 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS":<<< 33277 1726883059.70979: stdout chunk (state=3): >>> "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, 
"ansible_loadavg": {"1m": 0.64794921875, "5m": 0.53271484375, "15m": 0.35595703125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:cc:08:73", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.201", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fecc:873", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": 
"off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": 
"on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.201", "broadcast": 
"10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:cc:08:73", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.201"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fecc:873"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.201", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fecc:873"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3020, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 696, "free": 3020}, "nocache": {"free": 3446, "used": 270}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec271e86-2ced-9ef3-6c7d-9c93e54dc434", "ansible_product_uuid": "ec271e86-2ced-9ef3-6c7d-9c93e54dc434", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1017, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251366965248, "block_size": 4096, "block_total": 64483404, "block_available": 61368888, "block_used": 3114516, "inode_total": 16384000, "inode_available": 16302591, "inode_used": 81409, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "44", "second": "19", "epoch": "1726883059", "epoch_int": "1726883059", "date": "2024-09-20", "time": "21:44:19", "iso8601_micro": "2024-09-21T01:44:19.703251Z", "iso8601": "2024-09-21T01:44:19Z", "iso8601_basic": "20240920T214419703251", "iso8601_basic_short": "20240920T214419", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 33277 1726883059.71897: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys<<< 33277 1726883059.71980: stdout chunk (state=3): >>> # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] 
removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 33277 1726883059.72026: stdout chunk (state=3): >>># cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse<<< 33277 1726883059.72102: stdout chunk (state=3): >>> # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil<<< 33277 1726883059.72132: stdout chunk (state=3): >>> # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token<<< 33277 1726883059.72197: stdout chunk (state=3): >>> # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader<<< 33277 1726883059.72208: stdout chunk (state=3): >>> # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # 
cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters<<< 33277 1726883059.72289: stdout chunk (state=3): >>> # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters<<< 33277 1726883059.72317: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # 
cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext <<< 33277 1726883059.72565: stdout chunk (state=3): >>># cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace<<< 33277 1726883059.72568: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # 
cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi<<< 33277 1726883059.72678: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 33277 1726883059.73093: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 33277 1726883059.73129: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 33277 1726883059.73151: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 33277 1726883059.73197: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 33277 1726883059.73214: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 33277 1726883059.73269: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 33277 1726883059.73461: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 33277 1726883059.73531: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 33277 
1726883059.73551: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle <<< 33277 1726883059.73577: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 33277 1726883059.73650: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 33277 1726883059.73674: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl <<< 33277 1726883059.73850: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 33277 1726883059.73890: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves<<< 33277 1726883059.73924: stdout chunk (state=3): >>> # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 33277 1726883059.73946: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 33277 1726883059.74130: stdout chunk (state=3): >>># destroy linecache # destroy 
textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 33277 1726883059.74136: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 33277 1726883059.74159: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 <<< 33277 1726883059.74296: 
stdout chunk (state=3): >>># destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 33277 1726883059.74379: stdout chunk (state=3): >>># destroy sys.monitoring <<< 33277 1726883059.74413: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 33277 1726883059.74526: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 33277 1726883059.74553: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 33277 1726883059.74576: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 33277 1726883059.74598: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 33277 1726883059.74735: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 33277 1726883059.74765: stdout chunk (state=3): >>># destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 33277 1726883059.74857: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 33277 1726883059.74952: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy 
posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 33277 1726883059.75624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.201 closed. <<< 33277 1726883059.75628: stdout chunk (state=3): >>><<< 33277 1726883059.75638: stderr chunk (state=3): >>><<< 33277 1726883059.76046: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02df4530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02dc3b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02df6ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding 
directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02e051c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02e05fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02be3dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # 
import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02be3fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c1b800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c1be90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02bfbaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02bf91c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02be0f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c3f770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c3e390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02bfa060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c3cbc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6c7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02be0200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c6cc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6cb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c6cef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02bded20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6d5e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6d2b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6e4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c886e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c89e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c8ac90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c8b2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c8a210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02c8bd40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c8b4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02987c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py 
# code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc029b07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029b0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc029b07d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc029b09b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02985df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029b2090> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029b0d40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02c6ec00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py 
# code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029da450> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029f65a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02a2b320> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02a51a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02a2b440> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fdc029f7230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02870410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029f55e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc029b2ff0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdc029f5370> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_uwo3sf93/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028da120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028b1010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028b01a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028b3fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02909ac0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02909850> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02909160> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02909640> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc028dadb0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 
'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc0290a840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc0290aa80> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0290af30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02770c80> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027728a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02773260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02774440> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02776ed0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02777200> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02775190> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0277aea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02779970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027796d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' 
import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0277be60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027756a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027bf0b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027bf1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027c0da0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027c0b60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # 
extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027c32f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027c1490> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027cea50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027c33e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027cfd40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027cf770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027cfd70> import 
'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027bf470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027d34d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027d46e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d1c40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc027d2fc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d1820> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc0265c7d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0265d4f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d08c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0265d520> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0265f110> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02666150> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02666ab0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0265f1a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc026657c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02666cc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc026facc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fdc02670b00> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0266ab10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc0266a960> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02701a30> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02144260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc021445c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc026e12e0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc026e0320> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02700110> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02703b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc02147650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02146f00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc021470e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02146330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc02147740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc021b2210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027d1ee0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc027011f0> import 'ansible.module_utils.facts.timeout' # import 
'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdc021b24e0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc021b3140> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc021de5d0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc021cb050> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches 
/usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc01b05b50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b05760> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdc01b2e5d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b2ca40> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fdc027d0560> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b74e30> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b75ca0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc021e4350> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdc01b779e0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-13-201.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-13-201", "ansible_nodename": "ip-10-31-13-201.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec271e862ced9ef36c7d9c93e54dc434", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC4W6DnNxvXYP35ucYbXw0Rl8U5i76+nSxNOU9FlzmsJfP+uzNhFbvB4JNXnwJkgLimlQx7Nu4tUfylwwUKU5RSOnfX7+XU/7U5N+ASiAKnaE1eM8bey+vKw9yUCMWqMP2JJIgbUfrj1fualRuP7TrWzyiaD45ZlzS8WUIPQfUcjJeKBuBpKm2txHt8z07reCn9Fo3J0MgPpZzqYyBtz5cZQnqf00a57ZIS+In/5ZiOM6vvUsdnOcOJGDxnyvRpRnI/sIkkY1r225c9v45LCL1yhDWwDf5R1XcreHVgFvphaGxscm73CzunaAx07tOElGh9BdFCrRFyxdmW1+ZtrQ3PMZ09fRbdch7zE5b4TZkzJbvzN7gcJ20YhE+rMmJaOo/JHUip77V84gKyvbg1sSNgYkgUatYc4ak/dpXrGmdz5cTjowJXnle1DjXdDs/awxg1674TWMgDTcHLLj1RZ9NE6IoHTPzIBcMTgzpJlnV9v978N3Ar/pLxkGAPT8Q0+f8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDTe5a484DuRaUfRyVR9WiLfG+w2SIuQ3XCHSggW57gjmGhOPH7dR2w1D1xTofL2l7g+iaW6X0H/koP81LSjWMM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAILH/nH8SHxMjAzlrA3ts+XxnIQkq1Q/jggpukWw+sAXV", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 54870 10.31.13.201 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 54870 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.64794921875, "5m": 0.53271484375, "15m": 
0.35595703125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:cc:08:73", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.201", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fecc:873", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.201", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", 
"network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:cc:08:73", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.201"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fecc:873"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.201", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fecc:873"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3020, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 696, "free": 3020}, "nocache": {"free": 3446, "used": 270}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec271e86-2ced-9ef3-6c7d-9c93e54dc434", "ansible_product_uuid": "ec271e86-2ced-9ef3-6c7d-9c93e54dc434", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": 
[], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1017, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251366965248, "block_size": 4096, "block_total": 64483404, "block_available": 61368888, "block_used": 3114516, "inode_total": 16384000, "inode_available": 16302591, "inode_used": 81409, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "44", "second": "19", "epoch": "1726883059", "epoch_int": "1726883059", "date": "2024-09-20", "time": "21:44:19", "iso8601_micro": "2024-09-21T01:44:19.703251Z", "iso8601": "2024-09-21T01:44:19Z", "iso8601_basic": "20240920T214419703251", "iso8601_basic_short": "20240920T214419", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", 
"ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # 
cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing 
ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing 
gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing 
ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib 
# destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc 
# cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # 
cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.201 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing 
_signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random 
# destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing 
systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] 
removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] 
removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing 
ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # 
destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy 
ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # 
destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy 
collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy 
encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 33277 1726883059.77840: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33277 1726883059.77844: _low_level_execute_command(): starting 33277 1726883059.77846: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883056.899129-33328-230731285251255/ > /dev/null 2>&1 && sleep 0' 33277 1726883059.78440: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 1726883059.78735: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883059.78747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883059.78838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883059.81625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883059.81629: stderr chunk (state=3): >>><<< 33277 1726883059.81634: stdout chunk (state=3): >>><<< 33277 1726883059.81871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883059.81875: handler run complete 33277 1726883059.82026: variable 'ansible_facts' from source: unknown 33277 1726883059.82315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883059.83003: variable 'ansible_facts' from source: unknown 33277 1726883059.83177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883059.83598: attempt loop complete, returning result 33277 1726883059.83654: _execute() done 33277 1726883059.83662: dumping result to json 33277 1726883059.83755: done dumping result, returning 33277 1726883059.83758: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affc7ec-ae25-6628-6da4-000000000147] 33277 1726883059.83761: sending task result for task 0affc7ec-ae25-6628-6da4-000000000147 33277 1726883059.84749: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000147 33277 1726883059.84753: WORKER PROCESS EXITING ok: [managed_node2] 33277 1726883059.85826: no more pending results, returning what we have 33277 1726883059.85833: results queue empty 33277 1726883059.85834: checking for any_errors_fatal 33277 1726883059.85836: done checking for any_errors_fatal 33277 1726883059.85836: checking for max_fail_percentage 33277 1726883059.85838: done checking for 
max_fail_percentage 33277 1726883059.85839: checking to see if all hosts have failed and the running result is not ok 33277 1726883059.85840: done checking to see if all hosts have failed 33277 1726883059.85841: getting the remaining hosts for this loop 33277 1726883059.85842: done getting the remaining hosts for this loop 33277 1726883059.85846: getting the next task for host managed_node2 33277 1726883059.85852: done getting next task for host managed_node2 33277 1726883059.85854: ^ task is: TASK: meta (flush_handlers) 33277 1726883059.85856: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883059.85860: getting variables 33277 1726883059.85862: in VariableManager get_vars() 33277 1726883059.85884: Calling all_inventory to load vars for managed_node2 33277 1726883059.85886: Calling groups_inventory to load vars for managed_node2 33277 1726883059.85890: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883059.85903: Calling all_plugins_play to load vars for managed_node2 33277 1726883059.85906: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883059.85909: Calling groups_plugins_play to load vars for managed_node2 33277 1726883059.86325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883059.86907: done with get_vars() 33277 1726883059.86919: done getting variables 33277 1726883059.87130: in VariableManager get_vars() 33277 1726883059.87142: Calling all_inventory to load vars for managed_node2 33277 1726883059.87144: Calling groups_inventory to load vars for managed_node2 33277 1726883059.87147: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883059.87152: Calling 
all_plugins_play to load vars for managed_node2 33277 1726883059.87155: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883059.87158: Calling groups_plugins_play to load vars for managed_node2 33277 1726883059.87477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883059.87710: done with get_vars() 33277 1726883059.87728: done queuing things up, now waiting for results queue to drain 33277 1726883059.87730: results queue empty 33277 1726883059.87731: checking for any_errors_fatal 33277 1726883059.87735: done checking for any_errors_fatal 33277 1726883059.87735: checking for max_fail_percentage 33277 1726883059.87736: done checking for max_fail_percentage 33277 1726883059.87737: checking to see if all hosts have failed and the running result is not ok 33277 1726883059.87738: done checking to see if all hosts have failed 33277 1726883059.87742: getting the remaining hosts for this loop 33277 1726883059.87743: done getting the remaining hosts for this loop 33277 1726883059.87746: getting the next task for host managed_node2 33277 1726883059.87752: done getting next task for host managed_node2 33277 1726883059.87754: ^ task is: TASK: Include the task 'el_repo_setup.yml' 33277 1726883059.87756: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883059.87758: getting variables 33277 1726883059.87759: in VariableManager get_vars() 33277 1726883059.87773: Calling all_inventory to load vars for managed_node2 33277 1726883059.87776: Calling groups_inventory to load vars for managed_node2 33277 1726883059.87778: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883059.87783: Calling all_plugins_play to load vars for managed_node2 33277 1726883059.87785: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883059.87788: Calling groups_plugins_play to load vars for managed_node2 33277 1726883059.87997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883059.88254: done with get_vars() 33277 1726883059.88262: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11 Friday 20 September 2024 21:44:19 -0400 (0:00:03.055) 0:00:03.072 ****** 33277 1726883059.88379: entering _queue_task() for managed_node2/include_tasks 33277 1726883059.88382: Creating lock for include_tasks 33277 1726883059.88788: worker is 1 (out of 1 available) 33277 1726883059.88801: exiting _queue_task() for managed_node2/include_tasks 33277 1726883059.88867: done queuing things up, now waiting for results queue to drain 33277 1726883059.88870: waiting for pending results... 
33277 1726883059.89059: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 33277 1726883059.89179: in run() - task 0affc7ec-ae25-6628-6da4-000000000006 33277 1726883059.89234: variable 'ansible_search_path' from source: unknown 33277 1726883059.89310: calling self._execute() 33277 1726883059.89366: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883059.89378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883059.89393: variable 'omit' from source: magic vars 33277 1726883059.89539: _execute() done 33277 1726883059.89547: dumping result to json 33277 1726883059.89554: done dumping result, returning 33277 1726883059.89580: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0affc7ec-ae25-6628-6da4-000000000006] 33277 1726883059.89594: sending task result for task 0affc7ec-ae25-6628-6da4-000000000006 33277 1726883059.89936: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000006 33277 1726883059.89939: WORKER PROCESS EXITING 33277 1726883059.90004: no more pending results, returning what we have 33277 1726883059.90009: in VariableManager get_vars() 33277 1726883059.90038: Calling all_inventory to load vars for managed_node2 33277 1726883059.90041: Calling groups_inventory to load vars for managed_node2 33277 1726883059.90044: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883059.90079: Calling all_plugins_play to load vars for managed_node2 33277 1726883059.90082: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883059.90086: Calling groups_plugins_play to load vars for managed_node2 33277 1726883059.90471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883059.90699: done with get_vars() 33277 1726883059.90707: variable 'ansible_search_path' from source: unknown 33277 1726883059.90720: we have 
included files to process 33277 1726883059.90723: generating all_blocks data 33277 1726883059.90725: done generating all_blocks data 33277 1726883059.90726: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33277 1726883059.90727: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33277 1726883059.90730: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 33277 1726883059.91613: in VariableManager get_vars() 33277 1726883059.91783: done with get_vars() 33277 1726883059.91797: done processing included file 33277 1726883059.91799: iterating over new_blocks loaded from include file 33277 1726883059.91801: in VariableManager get_vars() 33277 1726883059.91810: done with get_vars() 33277 1726883059.91812: filtering new block on tags 33277 1726883059.91828: done filtering new block on tags 33277 1726883059.91832: in VariableManager get_vars() 33277 1726883059.91841: done with get_vars() 33277 1726883059.91843: filtering new block on tags 33277 1726883059.91859: done filtering new block on tags 33277 1726883059.91861: in VariableManager get_vars() 33277 1726883059.91944: done with get_vars() 33277 1726883059.91946: filtering new block on tags 33277 1726883059.91962: done filtering new block on tags 33277 1726883059.91964: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 33277 1726883059.91969: extending task lists for all hosts with included blocks 33277 1726883059.92061: done extending task lists 33277 1726883059.92062: done processing included files 33277 1726883059.92063: results queue empty 33277 1726883059.92064: checking for any_errors_fatal 33277 1726883059.92065: done checking for any_errors_fatal 33277 
1726883059.92066: checking for max_fail_percentage 33277 1726883059.92067: done checking for max_fail_percentage 33277 1726883059.92068: checking to see if all hosts have failed and the running result is not ok 33277 1726883059.92069: done checking to see if all hosts have failed 33277 1726883059.92069: getting the remaining hosts for this loop 33277 1726883059.92071: done getting the remaining hosts for this loop 33277 1726883059.92073: getting the next task for host managed_node2 33277 1726883059.92077: done getting next task for host managed_node2 33277 1726883059.92079: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 33277 1726883059.92081: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883059.92083: getting variables 33277 1726883059.92084: in VariableManager get_vars() 33277 1726883059.92092: Calling all_inventory to load vars for managed_node2 33277 1726883059.92094: Calling groups_inventory to load vars for managed_node2 33277 1726883059.92214: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883059.92220: Calling all_plugins_play to load vars for managed_node2 33277 1726883059.92225: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883059.92228: Calling groups_plugins_play to load vars for managed_node2 33277 1726883059.92502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883059.92937: done with get_vars() 33277 1726883059.92946: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:44:19 -0400 (0:00:00.046) 0:00:03.118 ****** 33277 1726883059.93019: entering _queue_task() for managed_node2/setup 33277 1726883059.93338: worker is 1 (out of 1 available) 33277 1726883059.93350: exiting _queue_task() for managed_node2/setup 33277 1726883059.93362: done queuing things up, now waiting for results queue to drain 33277 1726883059.93364: waiting for pending results... 
33277 1726883059.93739: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 33277 1726883059.93744: in run() - task 0affc7ec-ae25-6628-6da4-000000000158 33277 1726883059.93747: variable 'ansible_search_path' from source: unknown 33277 1726883059.93763: variable 'ansible_search_path' from source: unknown 33277 1726883059.93803: calling self._execute() 33277 1726883059.93890: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883059.93902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883059.93915: variable 'omit' from source: magic vars 33277 1726883059.94491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33277 1726883059.97191: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33277 1726883059.97267: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33277 1726883059.97408: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33277 1726883059.97611: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33277 1726883059.97657: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33277 1726883059.97794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33277 1726883059.97841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33277 1726883059.97886: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33277 1726883059.98058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33277 1726883059.98063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33277 1726883059.98180: variable 'ansible_facts' from source: unknown 33277 1726883059.98275: variable 'network_test_required_facts' from source: task vars 33277 1726883059.98313: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 33277 1726883059.98384: variable 'omit' from source: magic vars 33277 1726883059.98387: variable 'omit' from source: magic vars 33277 1726883059.98428: variable 'omit' from source: magic vars 33277 1726883059.98472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33277 1726883059.98524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33277 1726883059.98550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33277 1726883059.98602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33277 1726883059.98606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33277 1726883059.98647: variable 'inventory_hostname' from source: host vars for 'managed_node2' 33277 1726883059.98656: variable 'ansible_host' from source: host vars for 
'managed_node2' 33277 1726883059.98671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883059.98881: Set connection var ansible_pipelining to False 33277 1726883059.98927: Set connection var ansible_connection to ssh 33277 1726883059.98933: Set connection var ansible_timeout to 10 33277 1726883059.98936: Set connection var ansible_shell_executable to /bin/sh 33277 1726883059.98939: Set connection var ansible_shell_type to sh 33277 1726883059.98941: Set connection var ansible_module_compression to ZIP_DEFLATED 33277 1726883059.99139: variable 'ansible_shell_executable' from source: unknown 33277 1726883059.99142: variable 'ansible_connection' from source: unknown 33277 1726883059.99144: variable 'ansible_module_compression' from source: unknown 33277 1726883059.99147: variable 'ansible_shell_type' from source: unknown 33277 1726883059.99149: variable 'ansible_shell_executable' from source: unknown 33277 1726883059.99151: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883059.99153: variable 'ansible_pipelining' from source: unknown 33277 1726883059.99155: variable 'ansible_timeout' from source: unknown 33277 1726883059.99157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883059.99306: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 33277 1726883059.99325: variable 'omit' from source: magic vars 33277 1726883059.99335: starting attempt loop 33277 1726883059.99341: running the handler 33277 1726883059.99359: _low_level_execute_command(): starting 33277 1726883059.99403: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33277 1726883060.01078: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 
1726883060.01107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883060.01130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883060.01196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33277 1726883060.01226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 <<< 33277 1726883060.01239: stderr chunk (state=3): >>>debug2: match not found <<< 33277 1726883060.01256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883060.01315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883060.01391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883060.01425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883060.01560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33277 1726883060.04761: stdout chunk (state=3): >>>/root <<< 33277 1726883060.04765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883060.04768: stdout chunk (state=3): >>><<< 33277 1726883060.04771: stderr chunk (state=3): >>><<< 33277 1726883060.04781: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 33277 1726883060.04783: _low_level_execute_command(): starting 33277 1726883060.04786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379 `" && echo ansible-tmp-1726883060.0470643-33446-249795417910379="` echo /root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379 `" ) && sleep 0' 33277 1726883060.05491: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883060.05506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883060.05559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883060.05629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883060.07904: stdout chunk (state=3): >>>ansible-tmp-1726883060.0470643-33446-249795417910379=/root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379 <<< 33277 1726883060.07908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883060.08036: stderr chunk (state=3): >>><<< 33277 1726883060.08040: stdout chunk (state=3): >>><<< 33277 1726883060.08042: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883060.0470643-33446-249795417910379=/root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883060.08044: variable 'ansible_module_compression' from source: unknown 33277 1726883060.08483: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-33277prfh61zr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 33277 1726883060.08487: variable 'ansible_facts' from source: unknown 33277 1726883060.09130: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/AnsiballZ_setup.py 33277 1726883060.09588: Sending initial data 33277 1726883060.09591: Sent initial data (154 bytes) 33277 1726883060.10284: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 1726883060.10299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883060.10336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883060.10353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33277 1726883060.10441: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883060.10462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883060.10477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883060.10554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883060.12892: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 33277 1726883060.12909: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 33277 1726883060.12928: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 33277 1726883060.12994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 33277 1726883060.13179: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33277prfh61zr/tmpm_fbkkm2 /root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/AnsiballZ_setup.py <<< 33277 1726883060.13204: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/AnsiballZ_setup.py" <<< 33277 1726883060.13432: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-33277prfh61zr/tmpm_fbkkm2" to remote "/root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/AnsiballZ_setup.py" <<< 33277 1726883060.15861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883060.15904: stderr chunk (state=3): >>><<< 33277 1726883060.15918: stdout chunk (state=3): >>><<< 33277 1726883060.15951: done transferring module to remote 33277 1726883060.15974: _low_level_execute_command(): starting 33277 1726883060.16001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/ /root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/AnsiballZ_setup.py && sleep 0' 33277 1726883060.16528: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883060.16548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883060.16563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883060.16601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883060.16615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883060.16666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883060.18941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883060.18975: stderr chunk (state=3): >>><<< 33277 1726883060.18978: stdout chunk (state=3): >>><<< 33277 1726883060.19159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883060.19163: _low_level_execute_command(): starting 33277 1726883060.19166: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/AnsiballZ_setup.py && sleep 0' 33277 1726883060.19565: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 1726883060.19578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883060.19596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883060.19626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883060.19670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883060.19674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 33277 1726883060.19743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 33277 1726883060.22813: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 33277 1726883060.22868: stdout chunk (state=3): >>>import _imp # builtin <<< 33277 1726883060.22899: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 33277 1726883060.22959: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 33277 1726883060.23005: stdout chunk (state=3): >>>import 'posix' # <<< 33277 1726883060.23044: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 33277 1726883060.23069: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 33277 1726883060.23133: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 33277 1726883060.23157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 33277 1726883060.23174: stdout chunk (state=3): >>>import 'codecs' # <<< 33277 1726883060.23229: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 33277 1726883060.23255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 33277 1726883060.23275: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5bfc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5bcbb30> <<< 33277 1726883060.23299: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code 
object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5bfeab0> <<< 33277 1726883060.23333: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 33277 1726883060.23358: stdout chunk (state=3): >>>import 'io' # <<< 33277 1726883060.23403: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 33277 1726883060.23489: stdout chunk (state=3): >>>import '_collections_abc' # <<< 33277 1726883060.23517: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 33277 1726883060.23562: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 33277 1726883060.23576: stdout chunk (state=3): >>>Processing user site-packages <<< 33277 1726883060.23804: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f59f11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f59f2000> import 'site' # <<< 33277 1726883060.23832: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 33277 1726883060.24517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 33277 1726883060.24556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 33277 1726883060.24559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883060.24645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 33277 1726883060.24648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 33277 1726883060.24670: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 33277 1726883060.24739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 33277 1726883060.24767: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 33277 1726883060.24794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2ff50> <<< 33277 1726883060.24819: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 33277 1726883060.24881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 33277 1726883060.24945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883060.24977: stdout chunk (state=3): >>>import 'itertools' # <<< 33277 1726883060.25030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a67830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 33277 1726883060.25032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a67ec0> <<< 33277 1726883060.25080: stdout chunk (state=3): >>>import '_collections' # <<< 33277 1726883060.25129: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a47b00> <<< 33277 1726883060.25161: stdout chunk (state=3): >>>import '_functools' # <<< 33277 1726883060.25218: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a451f0> <<< 33277 1726883060.25383: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2d040> <<< 33277 1726883060.25462: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 33277 1726883060.25465: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 33277 1726883060.25531: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 33277 1726883060.25595: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a8b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a8a3f0> <<< 33277 1726883060.25636: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc'<<< 33277 1726883060.25639: stdout chunk (state=3): >>> import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a462a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a88bf0> <<< 33277 1726883060.25716: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab8830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2c2f0> <<< 33277 1726883060.25776: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.25793: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ab8ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab8b90> <<< 33277 1726883060.25865: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ab8f80> <<< 33277 1726883060.25900: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2ae40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 33277 1726883060.25938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 33277 1726883060.25964: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab9340> import 'importlib.machinery' # <<< 33277 1726883060.26006: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 33277 1726883060.26017: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5aba570> <<< 33277 1726883060.26040: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 33277 1726883060.26117: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 33277 1726883060.26140: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ad4770> <<< 33277 1726883060.26168: stdout chunk (state=3): >>>import 'errno' # <<< 33277 1726883060.26190: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.26218: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ad5eb0> <<< 33277 1726883060.26244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 33277 1726883060.26291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 33277 1726883060.26296: stdout chunk (state=3): >>> <<< 33277 1726883060.26345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ad6d50><<< 33277 1726883060.26596: stdout chunk (state=3): >>> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ad73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ad62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ad7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ad7530> <<< 33277 1726883060.26653: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5aba5a0> <<< 33277 1726883060.26701: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 33277 1726883060.26774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 33277 1726883060.26778: stdout chunk (state=3): >>> <<< 33277 1726883060.26810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 33277 1726883060.26871: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.26919: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.26926: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f581bcb0> <<< 33277 1726883060.27076: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5844830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5844590> <<< 33277 1726883060.27079: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5844860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5844a40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5819e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 33277 1726883060.27305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f58460f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5844d70> 
import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5abac90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 33277 1726883060.27351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883060.27370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 33277 1726883060.27418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 33277 1726883060.27447: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f586e480> <<< 33277 1726883060.27496: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 33277 1726883060.27534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 33277 1726883060.27675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 33277 1726883060.27679: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f588a630> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 33277 1726883060.27746: stdout chunk (state=3): >>>import 'ntpath' # <<< 33277 1726883060.27772: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f58bf410> <<< 33277 1726883060.27798: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 33277 1726883060.27894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 33277 1726883060.27914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 33277 1726883060.27983: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f58e5bb0> <<< 33277 1726883060.28073: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f58bf530> <<< 33277 1726883060.28135: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f588b2c0> <<< 33277 1726883060.28212: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f56c8590> <<< 33277 1726883060.28231: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5889670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5847020> <<< 33277 1726883060.28394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f03f56c8860> <<< 33277 
1726883060.29005: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_lspo74mm/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 33277 1726883060.29009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 33277 1726883060.29058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 33277 1726883060.29184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 33277 1726883060.29215: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 33277 1726883060.29243: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5736390> import '_typing' # <<< 33277 1726883060.29571: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f570d280> <<< 33277 1726883060.29617: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f570c3e0> <<< 33277 1726883060.29640: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.29684: stdout chunk (state=3): >>>import 'ansible' # <<< 33277 1726883060.29704: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.29725: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.29770: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.29773: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 33277 1726883060.29795: stdout chunk (state=3): >>># 
zipimport: zlib available<<< 33277 1726883060.29946: stdout chunk (state=3): >>> <<< 33277 1726883060.32260: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.32266: stdout chunk (state=3): >>> <<< 33277 1726883060.34389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py<<< 33277 1726883060.34392: stdout chunk (state=3): >>> <<< 33277 1726883060.34414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc'<<< 33277 1726883060.34421: stdout chunk (state=3): >>> <<< 33277 1726883060.34446: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5734260> <<< 33277 1726883060.34472: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 33277 1726883060.34558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 33277 1726883060.34564: stdout chunk (state=3): >>> <<< 33277 1726883060.34608: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.34625: stdout chunk (state=3): >>> <<< 33277 1726883060.34629: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.34648: stdout chunk (state=3): >>>import '_json' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5761e20> <<< 33277 1726883060.34755: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5761bb0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f57614c0><<< 33277 1726883060.34762: stdout chunk (state=3): >>> <<< 33277 1726883060.34788: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py<<< 33277 1726883060.34799: stdout chunk (state=3): >>> <<< 33277 1726883060.34818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 33277 1726883060.34824: stdout chunk (state=3): >>> <<< 33277 1726883060.34875: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5761910><<< 33277 1726883060.34908: stdout chunk (state=3): >>> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5737020> import 'atexit' # <<< 33277 1726883060.34951: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.34994: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5762b70> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.35006: stdout chunk (state=3): >>> <<< 33277 1726883060.35012: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.35029: stdout chunk (state=3): >>> <<< 33277 1726883060.35057: stdout chunk (state=3): >>>import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5762d50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 33277 1726883060.35135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 33277 1726883060.35165: stdout chunk (state=3): >>> import '_locale' # <<< 33277 1726883060.35255: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5763290> import 'pwd' # <<< 33277 1726883060.35287: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py<<< 33277 1726883060.35338: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 33277 1726883060.35342: stdout chunk (state=3): >>> <<< 33277 1726883060.35400: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55ccfb0><<< 33277 1726883060.35406: stdout chunk (state=3): >>> <<< 33277 1726883060.35438: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.35460: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.35464: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f55cebd0><<< 33277 1726883060.35499: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 33277 1726883060.35535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 33277 1726883060.35598: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f03f55cf590><<< 33277 1726883060.35602: stdout chunk (state=3): >>> <<< 33277 1726883060.35635: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 33277 1726883060.35639: stdout chunk (state=3): >>> <<< 33277 1726883060.35687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 33277 1726883060.35694: stdout chunk (state=3): >>> <<< 33277 1726883060.35725: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d0770><<< 33277 1726883060.35731: stdout chunk (state=3): >>> <<< 33277 1726883060.35761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 33277 1726883060.35819: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 33277 1726883060.35862: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 33277 1726883060.35879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 33277 1726883060.35978: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d3230> <<< 33277 1726883060.36032: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.36081: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5846f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d14f0><<< 33277 
1726883060.36087: stdout chunk (state=3): >>> <<< 33277 1726883060.36118: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 33277 1726883060.36167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 33277 1726883060.36170: stdout chunk (state=3): >>> <<< 33277 1726883060.36204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 33277 1726883060.36236: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 33277 1726883060.36290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 33277 1726883060.36328: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 33277 1726883060.36333: stdout chunk (state=3): >>> <<< 33277 1726883060.36350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 33277 1726883060.36386: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d71d0> import '_tokenize' # <<< 33277 1726883060.36394: stdout chunk (state=3): >>> <<< 33277 1726883060.36494: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d5ca0><<< 33277 1726883060.36531: stdout chunk (state=3): >>> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d5a00> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 33277 1726883060.36557: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 33277 1726883060.36645: stdout chunk (state=3): >>> <<< 33277 1726883060.36703: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d5f70> <<< 33277 1726883060.36755: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d1a00> <<< 33277 1726883060.36808: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.36814: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.36828: stdout chunk (state=3): >>> <<< 33277 1726883060.36839: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f561b230> <<< 33277 1726883060.36886: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 33277 1726883060.36890: stdout chunk (state=3): >>> <<< 33277 1726883060.36896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883060.36937: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561b440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 33277 1726883060.36944: stdout chunk (state=3): >>> <<< 33277 1726883060.36979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 33277 1726883060.36982: stdout chunk (state=3): >>> <<< 33277 1726883060.37009: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 33277 1726883060.37015: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 33277 1726883060.37045: stdout chunk (state=3): >>> <<< 33277 1726883060.37085: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.37094: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.37105: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f561cef0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561cce0><<< 33277 1726883060.37143: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 33277 1726883060.37324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 33277 1726883060.37403: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.37406: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.37426: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f561f380> <<< 33277 1726883060.37443: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561d580><<< 33277 1726883060.37448: stdout chunk (state=3): >>> <<< 33277 1726883060.37482: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/logging/__init__.py<<< 33277 1726883060.37549: stdout chunk (state=3): >>> <<< 33277 1726883060.37566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 33277 1726883060.37571: stdout chunk (state=3): >>> <<< 33277 1726883060.37605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 33277 1726883060.37632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 33277 1726883060.37642: stdout chunk (state=3): >>> <<< 33277 1726883060.37651: stdout chunk (state=3): >>>import '_string' # <<< 33277 1726883060.37731: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f562ab40> <<< 33277 1726883060.37961: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561f4d0> <<< 33277 1726883060.38082: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.38095: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.38104: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562b7d0> <<< 33277 1726883060.38159: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.38164: stdout chunk (state=3): >>> <<< 33277 1726883060.38258: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 
'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562ba10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562bdd0> <<< 33277 1726883060.38270: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561b620><<< 33277 1726883060.38279: stdout chunk (state=3): >>> <<< 33277 1726883060.38310: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 33277 1726883060.38324: stdout chunk (state=3): >>> <<< 33277 1726883060.38331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 33277 1726883060.38368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 33277 1726883060.38375: stdout chunk (state=3): >>> <<< 33277 1726883060.38417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 33277 1726883060.38424: stdout chunk (state=3): >>> <<< 33277 1726883060.38463: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.38516: stdout chunk (state=3): >>> # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.38524: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562f350> <<< 33277 1726883060.38820: stdout 
chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.38840: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.38873: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5630800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f562db20> <<< 33277 1726883060.38920: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.38962: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562eea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f562d7f0> # zipimport: zlib available<<< 33277 1726883060.38968: stdout chunk (state=3): >>> <<< 33277 1726883060.38990: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.39035: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 33277 1726883060.39187: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.39345: stdout chunk (state=3): >>> # zipimport: zlib available <<< 33277 1726883060.39376: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.39393: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 33277 1726883060.39429: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.39433: stdout chunk (state=3): >>> <<< 33277 1726883060.39473: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.39476: 
stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 33277 1726883060.39479: stdout chunk (state=3): >>> <<< 33277 1726883060.39509: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.39736: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.39742: stdout chunk (state=3): >>> <<< 33277 1726883060.39960: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.41035: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.42093: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 33277 1726883060.42114: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 33277 1726883060.42140: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 33277 1726883060.42144: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 33277 1726883060.42183: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 33277 1726883060.42348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f54b88c0> <<< 33277 1726883060.42429: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 33277 1726883060.42436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 33277 1726883060.42475: stdout chunk (state=3): >>>import 'ctypes._endian' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f03f54b96a0> <<< 33277 1726883060.42495: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f562c0e0> <<< 33277 1726883060.42569: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 33277 1726883060.42590: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.42619: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.42644: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 33277 1726883060.42669: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.42929: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.43186: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 33277 1726883060.43204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 33277 1726883060.43226: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54b9640> <<< 33277 1726883060.43241: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.44349: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.44977: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45108: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45235: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 33277 1726883060.45257: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45317: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45375: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 33277 1726883060.45396: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45513: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 33277 1726883060.45666: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 33277 1726883060.45691: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45713: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45731: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 33277 1726883060.45755: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45817: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.45876: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 33277 1726883060.45900: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.46324: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.46752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 33277 1726883060.46850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 33277 1726883060.46878: stdout chunk (state=3): >>>import '_ast' # <<< 33277 1726883060.46993: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54ba390> <<< 33277 1726883060.47015: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.47134: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.47251: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 33277 1726883060.47259: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 33277 1726883060.47277: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 33277 1726883060.47296: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 33277 1726883060.47551: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.47640: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f54c2180><<< 33277 1726883060.47643: stdout chunk (state=3): >>> <<< 33277 1726883060.47718: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.47731: stdout chunk (state=3): >>> <<< 33277 1726883060.47748: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.47754: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f54c2b10> <<< 33277 1726883060.47779: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab8bf0><<< 33277 1726883060.47786: stdout chunk (state=3): >>> <<< 33277 1726883060.47816: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.47819: stdout chunk (state=3): >>> <<< 33277 1726883060.47891: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.47897: stdout chunk (state=3): >>> <<< 33277 1726883060.47956: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 33277 1726883060.47962: stdout chunk (state=3): >>> <<< 33277 1726883060.47987: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.47994: stdout chunk (state=3): >>> <<< 33277 1726883060.48064: stdout chunk (state=3): >>># 
zipimport: zlib available<<< 33277 1726883060.48069: stdout chunk (state=3): >>> <<< 33277 1726883060.48147: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.48151: stdout chunk (state=3): >>> <<< 33277 1726883060.48248: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.48251: stdout chunk (state=3): >>> <<< 33277 1726883060.48364: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 33277 1726883060.48369: stdout chunk (state=3): >>> <<< 33277 1726883060.48450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 33277 1726883060.48455: stdout chunk (state=3): >>> <<< 33277 1726883060.48597: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.48603: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.48623: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f54c1790><<< 33277 1726883060.48630: stdout chunk (state=3): >>> <<< 33277 1726883060.48705: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54c2c90><<< 33277 1726883060.48708: stdout chunk (state=3): >>> <<< 33277 1726883060.48750: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 33277 1726883060.48783: stdout chunk (state=3): >>> import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 33277 1726883060.48988: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33277 1726883060.49036: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.49114: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 33277 1726883060.49117: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 33277 1726883060.49124: stdout chunk (state=3): >>> <<< 33277 1726883060.49170: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 33277 1726883060.49210: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 33277 1726883060.49249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 33277 1726883060.49344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 33277 1726883060.49349: stdout chunk (state=3): >>> <<< 33277 1726883060.49382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 33277 1726883060.49414: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 33277 1726883060.49421: stdout chunk (state=3): >>> <<< 33277 1726883060.49521: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5552de0><<< 33277 1726883060.49528: stdout chunk (state=3): >>> <<< 33277 1726883060.49729: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54cca70> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54cac00><<< 33277 1726883060.49735: stdout chunk (state=3): >>> import 
'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54caa80><<< 33277 1726883060.49739: stdout chunk (state=3): >>> <<< 33277 1726883060.49766: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 33277 1726883060.49773: stdout chunk (state=3): >>> <<< 33277 1726883060.49800: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.49807: stdout chunk (state=3): >>> <<< 33277 1726883060.49850: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.49855: stdout chunk (state=3): >>> <<< 33277 1726883060.49904: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 33277 1726883060.49907: stdout chunk (state=3): >>> import 'ansible.module_utils.common.sys_info' # <<< 33277 1726883060.50005: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 33277 1726883060.50010: stdout chunk (state=3): >>> <<< 33277 1726883060.50036: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.50040: stdout chunk (state=3): >>> <<< 33277 1726883060.50061: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.50094: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available<<< 33277 1726883060.50101: stdout chunk (state=3): >>> <<< 33277 1726883060.50248: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.50308: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.50346: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.50352: stdout chunk (state=3): >>> <<< 33277 1726883060.50389: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.50395: stdout chunk (state=3): >>> <<< 33277 1726883060.50460: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.50532: stdout chunk (state=3): >>> # zipimport: zlib available <<< 33277 1726883060.50596: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 33277 1726883060.50658: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 33277 1726883060.50692: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.50829: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.50965: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.50970: stdout chunk (state=3): >>> <<< 33277 1726883060.51007: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.51066: stdout chunk (state=3): >>> import 'ansible.module_utils.compat.typing' # <<< 33277 1726883060.51074: stdout chunk (state=3): >>> <<< 33277 1726883060.51093: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.51098: stdout chunk (state=3): >>> <<< 33277 1726883060.51418: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.51425: stdout chunk (state=3): >>> <<< 33277 1726883060.51732: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.51800: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.51804: stdout chunk (state=3): >>> <<< 33277 1726883060.51892: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 33277 1726883060.51896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc'<<< 33277 1726883060.51911: stdout chunk (state=3): >>> <<< 33277 1726883060.51947: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 33277 1726883060.51991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 33277 1726883060.52018: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py<<< 33277 1726883060.52064: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc'<<< 33277 1726883060.52069: stdout chunk (state=3): >>> <<< 33277 1726883060.52110: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5559b80> <<< 33277 1726883060.52144: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py<<< 33277 1726883060.52174: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 33277 1726883060.52279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 33277 1726883060.52320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py<<< 33277 1726883060.52327: stdout chunk (state=3): >>> <<< 33277 1726883060.52352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc'<<< 33277 1726883060.52357: stdout chunk (state=3): >>> <<< 33277 1726883060.52382: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abc350><<< 33277 1726883060.52390: stdout chunk (state=3): >>> <<< 33277 1726883060.52431: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'<<< 33277 1726883060.52552: stdout chunk (state=3): >>> # extension module '_pickle' executed from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4abc680> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55393a0> <<< 33277 1726883060.52610: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5538320> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5558260><<< 33277 1726883060.52615: stdout chunk (state=3): >>> <<< 33277 1726883060.52641: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5558d10><<< 33277 1726883060.52647: stdout chunk (state=3): >>> <<< 33277 1726883060.52680: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 33277 1726883060.52779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 33277 1726883060.52824: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 33277 1726883060.52860: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py<<< 33277 1726883060.52869: stdout chunk (state=3): >>> <<< 33277 1726883060.52889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 33277 1726883060.52928: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.52952: stdout chunk (state=3): >>># extension module '_heapq' executed from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4abf6e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abef90> <<< 33277 1726883060.53017: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4abf170> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abe3c0><<< 33277 1726883060.53020: stdout chunk (state=3): >>> <<< 33277 1726883060.53055: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py<<< 33277 1726883060.53060: stdout chunk (state=3): >>> <<< 33277 1726883060.53244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abf890><<< 33277 1726883060.53251: stdout chunk (state=3): >>> <<< 33277 1726883060.53324: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 33277 1726883060.53377: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.53383: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 
33277 1726883060.53403: stdout chunk (state=3): >>> import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4b26360> <<< 33277 1726883060.53452: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abffe0><<< 33277 1726883060.53501: stdout chunk (state=3): >>> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55592e0><<< 33277 1726883060.53517: stdout chunk (state=3): >>> <<< 33277 1726883060.53520: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 33277 1726883060.53544: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 33277 1726883060.53572: stdout chunk (state=3): >>> # zipimport: zlib available <<< 33277 1726883060.53598: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.53639: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 33277 1726883060.53740: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.53856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 33277 1726883060.53941: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.54041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available<<< 33277 1726883060.54047: stdout chunk (state=3): >>> <<< 33277 1726883060.54080: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 33277 1726883060.54104: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.54110: stdout chunk (state=3): >>> <<< 33277 1726883060.54192: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 33277 1726883060.54201: stdout chunk (state=3): >>> <<< 33277 
1726883060.54220: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.54223: stdout chunk (state=3): >>> <<< 33277 1726883060.54301: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.54347: stdout chunk (state=3): >>> <<< 33277 1726883060.54393: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 33277 1726883060.54412: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.54418: stdout chunk (state=3): >>> <<< 33277 1726883060.54488: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.54495: stdout chunk (state=3): >>> <<< 33277 1726883060.54570: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available<<< 33277 1726883060.54575: stdout chunk (state=3): >>> <<< 33277 1726883060.54748: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.54768: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.54868: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.54964: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 33277 1726883060.54984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 33277 1726883060.55011: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.55898: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.56711: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 33277 1726883060.56739: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.56747: stdout chunk (state=3): >>> <<< 33277 1726883060.56828: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.56919: stdout chunk (state=3): >>> # zipimport: zlib available <<< 33277 1726883060.56979: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.57029: stdout 
chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 33277 1726883060.57051: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 33277 1726883060.57086: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.57132: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.57182: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 33277 1726883060.57207: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.57301: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.57308: stdout chunk (state=3): >>> <<< 33277 1726883060.57392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 33277 1726883060.57397: stdout chunk (state=3): >>> <<< 33277 1726883060.57429: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.57435: stdout chunk (state=3): >>> <<< 33277 1726883060.57479: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.57485: stdout chunk (state=3): >>> <<< 33277 1726883060.57529: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 33277 1726883060.57535: stdout chunk (state=3): >>> <<< 33277 1726883060.57558: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.57561: stdout chunk (state=3): >>> <<< 33277 1726883060.57614: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.57620: stdout chunk (state=3): >>> <<< 33277 1726883060.57661: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 33277 1726883060.57688: stdout chunk (state=3): >>> # zipimport: zlib available <<< 33277 1726883060.57827: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.57945: stdout chunk (state=3): >>> <<< 33277 1726883060.57982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches 
/usr/lib64/python3.12/glob.py <<< 33277 1726883060.58009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 33277 1726883060.58059: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4b26600><<< 33277 1726883060.58068: stdout chunk (state=3): >>> <<< 33277 1726883060.58098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 33277 1726883060.58155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 33277 1726883060.58357: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4b27260> <<< 33277 1726883060.58380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 33277 1726883060.58411: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.58531: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.58646: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 33277 1726883060.58670: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.58676: stdout chunk (state=3): >>> <<< 33277 1726883060.58866: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.59040: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 33277 1726883060.59048: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.59142: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.59254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 33277 1726883060.59265: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.59318: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.59452: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 33277 1726883060.59455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 33277 1726883060.59575: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.59693: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.59703: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4b5a870> <<< 33277 1726883060.60052: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4b435c0> <<< 33277 1726883060.60055: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 33277 1726883060.60246: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 33277 1726883060.60388: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.60525: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.60725: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.60967: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 33277 1726883060.60971: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 33277 1726883060.60989: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.61039: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.61103: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 33277 1726883060.61116: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 
1726883060.61166: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.61233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 33277 1726883060.61279: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.61313: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.61324: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4911ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4b589e0> import 'ansible.module_utils.facts.system.user' # <<< 33277 1726883060.61373: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 33277 1726883060.61383: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.61525: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.61529: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 33277 1726883060.61531: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.61698: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.61887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 33277 1726883060.61891: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.61988: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.62094: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.62135: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 
1726883060.62183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 33277 1726883060.62228: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 33277 1726883060.62246: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.62397: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.62567: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 33277 1726883060.62699: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.62848: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 33277 1726883060.62851: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.62882: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.62916: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.63536: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.64116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 33277 1726883060.64119: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.64232: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.64349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 33277 1726883060.64462: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.64584: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 33277 1726883060.64587: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.65054: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.hardware.sunos' # <<< 33277 1726883060.65061: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 33277 1726883060.65113: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.65119: stdout chunk (state=3): >>> <<< 33277 1726883060.65191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 33277 1726883060.65197: stdout chunk (state=3): >>> <<< 33277 1726883060.65214: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.65220: stdout chunk (state=3): >>> <<< 33277 1726883060.65395: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.65401: stdout chunk (state=3): >>> <<< 33277 1726883060.65579: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.65583: stdout chunk (state=3): >>> <<< 33277 1726883060.65960: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.65966: stdout chunk (state=3): >>> <<< 33277 1726883060.66328: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 33277 1726883060.66363: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available<<< 33277 1726883060.66428: stdout chunk (state=3): >>> # zipimport: zlib available<<< 33277 1726883060.66435: stdout chunk (state=3): >>> <<< 33277 1726883060.66497: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 33277 1726883060.66502: stdout chunk (state=3): >>> <<< 33277 1726883060.66526: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.66532: stdout chunk (state=3): >>> <<< 33277 1726883060.66576: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.66582: stdout chunk (state=3): >>> <<< 33277 1726883060.66613: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.dragonfly' # <<< 33277 1726883060.66639: stdout chunk (state=3): >>> # zipimport: zlib available <<< 33277 1726883060.66878: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # <<< 33277 1726883060.66907: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.66955: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.67001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 33277 1726883060.67029: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.67133: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.67232: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 33277 1726883060.67261: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.67361: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.67433: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 33277 1726883060.67458: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.67866: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.68163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 33277 1726883060.68166: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.68226: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.68307: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 33277 1726883060.68310: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.68338: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.68376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 33277 1726883060.68390: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 
1726883060.68454: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.68465: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 33277 1726883060.68510: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.68538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 33277 1726883060.68829: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 33277 1726883060.68934: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # <<< 33277 1726883060.68949: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.68976: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.68982: stdout chunk (state=3): >>> <<< 33277 1726883060.69014: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.69047: stdout chunk (state=3): >>> <<< 33277 1726883060.69112: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.69118: stdout chunk (state=3): >>> <<< 33277 1726883060.69200: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.69248: stdout chunk (state=3): >>> <<< 33277 1726883060.69335: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.69342: stdout chunk (state=3): >>> <<< 33277 1726883060.69458: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 33277 1726883060.69472: stdout chunk (state=3): >>> <<< 33277 1726883060.69485: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 33277 1726883060.69493: stdout chunk (state=3): >>> <<< 33277 1726883060.69526: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available<<< 33277 1726883060.69531: stdout chunk (state=3): >>> <<< 33277 1726883060.69609: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.69650: stdout chunk (state=3): >>> <<< 33277 1726883060.69703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 33277 1726883060.69709: stdout chunk (state=3): >>> <<< 33277 1726883060.69730: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.69851: stdout chunk (state=3): >>> <<< 33277 1726883060.70116: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.70123: stdout chunk (state=3): >>> <<< 33277 1726883060.70477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 33277 1726883060.70485: stdout chunk (state=3): >>> <<< 33277 1726883060.70510: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.70517: stdout chunk (state=3): >>> <<< 33277 1726883060.70597: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.70675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 33277 1726883060.70681: stdout chunk (state=3): >>> <<< 33277 1726883060.70706: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.70792: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.70798: stdout chunk (state=3): >>> <<< 33277 1726883060.70872: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 33277 1726883060.70878: stdout chunk (state=3): >>> <<< 33277 1726883060.70895: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.70949: stdout chunk (state=3): >>> <<< 33277 1726883060.71057: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.71062: stdout chunk (state=3): >>> <<< 33277 1726883060.71202: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.sunos' # <<< 33277 1726883060.71213: stdout chunk (state=3): >>> <<< 33277 1726883060.71221: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 33277 1726883060.71250: stdout chunk (state=3): >>># zipimport: zlib available<<< 33277 1726883060.71349: stdout chunk (state=3): >>> <<< 33277 1726883060.71415: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.71575: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 33277 1726883060.71578: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 33277 1726883060.71688: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883060.72720: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 33277 1726883060.72748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 33277 1726883060.72765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 33277 1726883060.72813: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883060.72833: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f493b8c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4938470> <<< 33277 1726883060.72879: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f03f4939940> <<< 33277 1726883062.28277: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "44", "second": "20", "epoch": "1726883060", "epoch_int": "1726883060", "date": "2024-09-20", "time": "21:44:20", "iso8601_micro": "2024-09-21T01:44:20.718689Z", "iso8601": "2024-09-21T01:44:20Z", "iso8601_basic": "20240920T214420718689", "iso8601_basic_short": "20240920T214420", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC4W6DnNxvXYP35ucYbXw0Rl8U5i76+nSxNOU9FlzmsJfP+uzNhFbvB4JNXnwJkgLimlQx7Nu4tUfylwwUKU5RSOnfX7+XU/7U5N+ASiAKnaE1eM8bey+vKw9yUCMWqMP2JJIgbUfrj1fualRuP7TrWzyiaD45ZlzS8WUIPQfUcjJeKBuBpKm2txHt8z07reCn9Fo3J0MgPpZzqYyBtz5cZQnqf00a57ZIS+In/5ZiOM6vvUsdnOcOJGDxnyvRpRnI/sIkkY1r225c9v45LCL1yhDWwDf5R1XcreHVgFvphaGxscm73CzunaAx07tOElGh9BdFCrRFyxdmW1+ZtrQ3PMZ09fRbdch7zE5b4TZkzJbvzN7gcJ20YhE+rMmJaOo/JHUip77V84gKyvbg1sSNgYkgUatYc4ak/dpXrGmdz5cTjowJXnle1DjXdDs/awxg1674TWMgDTcHLLj1RZ9NE6IoHTPzIBcMTgzpJlnV9v978N3Ar/pLxkGAPT8Q0+f8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDTe5a484DuRaUfRyVR9WiLfG+w2SIuQ3XCHSggW57gjmGhOPH7dR2w1D1xTofL2l7g+iaW6X0H/koP81LSjWMM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAILH/nH8SHxMjAzlrA3ts+XxnIQkq1Q/jggpukWw+sAXV", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-13-201.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-13-201", "ansible_nodename": "ip-10-31-13-201.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec271e862ced9ef36c7d9c93e54dc434", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 54870 10.31.13.201 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 54870 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 33277 1726883062.28882: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 33277 1726883062.28889: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 33277 1726883062.28910: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] 
removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport <<< 33277 1726883062.28919: stdout chunk (state=3): >>># cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 33277 1726883062.28944: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 33277 1726883062.28972: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # 
cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils <<< 33277 1726883062.28999: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # 
cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token <<< 33277 1726883062.29003: stdout chunk (state=3): >>># destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 33277 1726883062.29026: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 33277 1726883062.29041: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy 
copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors <<< 33277 1726883062.29053: stdout chunk (state=3): >>># destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file <<< 33277 1726883062.29076: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] 
removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 33277 1726883062.29085: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq <<< 33277 1726883062.29118: stdout chunk (state=3): >>># cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # 
cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser <<< 33277 1726883062.29128: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # 
cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux <<< 33277 1726883062.29149: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution <<< 33277 1726883062.29184: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd <<< 33277 1726883062.29190: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy 
ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn <<< 33277 1726883062.29191: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 33277 1726883062.29525: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 33277 1726883062.29534: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 33277 1726883062.29564: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 33277 1726883062.29579: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # 
destroy lzma # destroy zipfile._path <<< 33277 1726883062.29607: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 33277 1726883062.29659: stdout chunk (state=3): >>># destroy ntpath <<< 33277 1726883062.29663: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 33277 1726883062.29665: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 33277 1726883062.29696: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp <<< 33277 1726883062.29707: stdout chunk (state=3): >>># destroy encodings <<< 33277 1726883062.29738: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select <<< 33277 1726883062.29744: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog <<< 33277 1726883062.29760: stdout chunk (state=3): >>># destroy uuid <<< 33277 1726883062.29794: stdout chunk (state=3): >>># destroy _hashlib <<< 33277 1726883062.29802: stdout chunk (state=3): >>># destroy _blake2 # destroy selinux # destroy shutil <<< 33277 1726883062.29819: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 33277 1726883062.29880: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 33277 1726883062.29883: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array<<< 33277 1726883062.29890: stdout chunk (state=3): >>> # destroy _compat_pickle <<< 33277 1726883062.29921: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 33277 1726883062.29932: stdout chunk (state=3): >>># destroy 
multiprocessing.process <<< 33277 1726883062.29957: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 33277 1726883062.29985: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 33277 1726883062.29988: stdout chunk (state=3): >>># destroy datetime <<< 33277 1726883062.29993: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 33277 1726883062.30015: stdout chunk (state=3): >>># destroy _ssl <<< 33277 1726883062.30041: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 33277 1726883062.30054: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 33277 1726883062.30062: stdout chunk (state=3): >>># destroy errno # destroy json <<< 33277 1726883062.30091: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 33277 1726883062.30094: stdout chunk (state=3): >>># destroy glob <<< 33277 1726883062.30100: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 33277 1726883062.30163: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 33277 1726883062.30180: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves <<< 33277 1726883062.30206: stdout chunk (state=3): >>># destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 33277 1726883062.30213: 
stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 33277 1726883062.30218: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 33277 1726883062.30304: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 33277 1726883062.30308: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 33277 1726883062.30348: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases <<< 33277 1726883062.30395: stdout chunk (state=3): >>># cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 33277 1726883062.30410: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 33277 1726883062.30567: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 33277 1726883062.30611: stdout chunk (state=3): >>># destroy _collections <<< 33277 1726883062.30627: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 33277 1726883062.30656: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 33277 1726883062.30698: stdout chunk (state=3): >>># destroy _typing <<< 33277 1726883062.30711: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 33277 1726883062.30749: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 33277 1726883062.30761: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 33277 1726883062.30863: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 33277 1726883062.30884: stdout chunk 
(state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time <<< 33277 1726883062.30904: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 33277 1726883062.30940: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 <<< 33277 1726883062.30977: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 33277 1726883062.30999: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 33277 1726883062.31528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883062.31559: stderr chunk (state=3): >>>Shared connection to 10.31.13.201 closed. <<< 33277 1726883062.31561: stdout chunk (state=3): >>><<< 33277 1726883062.31563: stderr chunk (state=3): >>><<< 33277 1726883062.31740: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5bfc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5bcbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5bfeab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f59f11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f59f2000> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a67830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a67ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a47b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a451f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a8b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a8a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a462a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a88bf0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab8830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ab8ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab8b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ab8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5a2ae40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab9340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5aba570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ad4770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ad5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ad6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ad73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ad62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5ad7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ad7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5aba5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f581bcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5844830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5844590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5844860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5844a40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5819e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f58460f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5844d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5abac90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f586e480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f588a630> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f58bf410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f58e5bb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f58bf530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f588b2c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f56c8590> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5889670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5847020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f03f56c8860> # zipimport: found 103 names in '/tmp/ansible_setup_payload_lspo74mm/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5736390> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f570d280> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f570c3e0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5734260> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5761e20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5761bb0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f57614c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5761910> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5737020> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5762b70> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5762d50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5763290> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55ccfb0> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f55cebd0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55cf590> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d0770> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d3230> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5846f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d14f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d71d0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d5ca0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d5a00> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d5f70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55d1a00> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f561b230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561b440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f561cef0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561cce0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f561f380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561d580> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f562ab40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561f4d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562b7d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562ba10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562bdd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f561b620> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562f350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f5630800> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f562db20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f562eea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f562d7f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f54b88c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54b96a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f562c0e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54b9640> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54ba390> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f54c2180> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f54c2b10> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5ab8bf0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f54c1790> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54c2c90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5552de0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54cca70> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54cac00> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f54caa80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5559b80> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abc350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4abc680> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f03f55393a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5538320> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5558260> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f5558d10> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4abf6e0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abef90> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4abf170> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abe3c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abf890> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4b26360> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4abffe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f55592e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4b26600> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4b27260> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4b5a870> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4b435c0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f4911ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4b589e0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f03f493b8c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4938470> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f03f4939940> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "44", "second": "20", "epoch": "1726883060", "epoch_int": 
"1726883060", "date": "2024-09-20", "time": "21:44:20", "iso8601_micro": "2024-09-21T01:44:20.718689Z", "iso8601": "2024-09-21T01:44:20Z", "iso8601_basic": "20240920T214420718689", "iso8601_basic_short": "20240920T214420", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC4W6DnNxvXYP35ucYbXw0Rl8U5i76+nSxNOU9FlzmsJfP+uzNhFbvB4JNXnwJkgLimlQx7Nu4tUfylwwUKU5RSOnfX7+XU/7U5N+ASiAKnaE1eM8bey+vKw9yUCMWqMP2JJIgbUfrj1fualRuP7TrWzyiaD45ZlzS8WUIPQfUcjJeKBuBpKm2txHt8z07reCn9Fo3J0MgPpZzqYyBtz5cZQnqf00a57ZIS+In/5ZiOM6vvUsdnOcOJGDxnyvRpRnI/sIkkY1r225c9v45LCL1yhDWwDf5R1XcreHVgFvphaGxscm73CzunaAx07tOElGh9BdFCrRFyxdmW1+ZtrQ3PMZ09fRbdch7zE5b4TZkzJbvzN7gcJ20YhE+rMmJaOo/JHUip77V84gKyvbg1sSNgYkgUatYc4ak/dpXrGmdz5cTjowJXnle1DjXdDs/awxg1674TWMgDTcHLLj1RZ9NE6IoHTPzIBcMTgzpJlnV9v978N3Ar/pLxkGAPT8Q0+f8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDTe5a484DuRaUfRyVR9WiLfG+w2SIuQ3XCHSggW57gjmGhOPH7dR2w1D1xTofL2l7g+iaW6X0H/koP81LSjWMM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAILH/nH8SHxMjAzlrA3ts+XxnIQkq1Q/jggpukWw+sAXV", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", 
"policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-13-201.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-13-201", "ansible_nodename": "ip-10-31-13-201.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec271e862ced9ef36c7d9c93e54dc434", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 54870 10.31.13.201 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 54870 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # 
cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] 
removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # 
cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # 
cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] 
removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other 
# destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy 
_posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping 
_typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy 
ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.201 closed. [WARNING]: Module invocation had junk after the JSON data:
# destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # 
destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy 
configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # 
cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 33277 1726883062.32616: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', 
'_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33277 1726883062.32620: _low_level_execute_command(): starting 33277 1726883062.32626: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883060.0470643-33446-249795417910379/ > /dev/null 2>&1 && sleep 0' 33277 1726883062.32682: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883062.32688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33277 1726883062.32691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883062.32700: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883062.32756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 
1726883062.32763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883062.32766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883062.32805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883062.34710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883062.34760: stderr chunk (state=3): >>><<< 33277 1726883062.34763: stdout chunk (state=3): >>><<< 33277 1726883062.34777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883062.34783: handler run complete 33277 1726883062.34817: variable 'ansible_facts' from source: unknown 33277 1726883062.34861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 33277 1726883062.34944: variable 'ansible_facts' from source: unknown 33277 1726883062.34977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883062.35014: attempt loop complete, returning result 33277 1726883062.35018: _execute() done 33277 1726883062.35020: dumping result to json 33277 1726883062.35046: done dumping result, returning 33277 1726883062.35050: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affc7ec-ae25-6628-6da4-000000000158] 33277 1726883062.35052: sending task result for task 0affc7ec-ae25-6628-6da4-000000000158 33277 1726883062.35572: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000158 33277 1726883062.35575: WORKER PROCESS EXITING ok: [managed_node2] 33277 1726883062.35755: no more pending results, returning what we have 33277 1726883062.35758: results queue empty 33277 1726883062.35759: checking for any_errors_fatal 33277 1726883062.35761: done checking for any_errors_fatal 33277 1726883062.35762: checking for max_fail_percentage 33277 1726883062.35764: done checking for max_fail_percentage 33277 1726883062.35764: checking to see if all hosts have failed and the running result is not ok 33277 1726883062.35765: done checking to see if all hosts have failed 33277 1726883062.35766: getting the remaining hosts for this loop 33277 1726883062.35768: done getting the remaining hosts for this loop 33277 1726883062.35771: getting the next task for host managed_node2 33277 1726883062.35780: done getting next task for host managed_node2 33277 1726883062.35782: ^ task is: TASK: Check if system is ostree 33277 1726883062.35788: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883062.35791: getting variables 33277 1726883062.35793: in VariableManager get_vars() 33277 1726883062.35815: Calling all_inventory to load vars for managed_node2 33277 1726883062.35818: Calling groups_inventory to load vars for managed_node2 33277 1726883062.35821: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883062.35838: Calling all_plugins_play to load vars for managed_node2 33277 1726883062.35841: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883062.35845: Calling groups_plugins_play to load vars for managed_node2 33277 1726883062.36056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883062.36309: done with get_vars() 33277 1726883062.36321: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:44:22 -0400 (0:00:02.434) 0:00:05.552 ****** 33277 1726883062.36438: entering _queue_task() for managed_node2/stat 33277 1726883062.36748: worker is 1 (out of 1 available) 33277 1726883062.36760: exiting _queue_task() for managed_node2/stat 33277 1726883062.36771: done queuing things up, now waiting for results queue to drain 33277 1726883062.36773: waiting for pending results... 
33277 1726883062.37143: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 33277 1726883062.37230: in run() - task 0affc7ec-ae25-6628-6da4-00000000015a 33277 1726883062.37234: variable 'ansible_search_path' from source: unknown 33277 1726883062.37237: variable 'ansible_search_path' from source: unknown 33277 1726883062.37241: calling self._execute() 33277 1726883062.37330: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883062.37346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883062.37362: variable 'omit' from source: magic vars 33277 1726883062.37926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33277 1726883062.38237: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33277 1726883062.38357: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33277 1726883062.38361: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33277 1726883062.38373: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33277 1726883062.38475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33277 1726883062.38509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33277 1726883062.38544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33277 1726883062.38583: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33277 1726883062.38730: Evaluated conditional (not __network_is_ostree is defined): True 33277 1726883062.38745: variable 'omit' from source: magic vars 33277 1726883062.38796: variable 'omit' from source: magic vars 33277 1726883062.38848: variable 'omit' from source: magic vars 33277 1726883062.38902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33277 1726883062.38926: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33277 1726883062.39012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33277 1726883062.39016: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33277 1726883062.39018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33277 1726883062.39031: variable 'inventory_hostname' from source: host vars for 'managed_node2' 33277 1726883062.39041: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883062.39050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883062.39172: Set connection var ansible_pipelining to False 33277 1726883062.39186: Set connection var ansible_connection to ssh 33277 1726883062.39199: Set connection var ansible_timeout to 10 33277 1726883062.39214: Set connection var ansible_shell_executable to /bin/sh 33277 1726883062.39228: Set connection var ansible_shell_type to sh 33277 1726883062.39242: Set connection var ansible_module_compression to ZIP_DEFLATED 33277 1726883062.39329: variable 'ansible_shell_executable' from source: unknown 33277 1726883062.39334: variable 'ansible_connection' from 
source: unknown 33277 1726883062.39338: variable 'ansible_module_compression' from source: unknown 33277 1726883062.39340: variable 'ansible_shell_type' from source: unknown 33277 1726883062.39342: variable 'ansible_shell_executable' from source: unknown 33277 1726883062.39344: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883062.39346: variable 'ansible_pipelining' from source: unknown 33277 1726883062.39348: variable 'ansible_timeout' from source: unknown 33277 1726883062.39349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883062.39628: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 33277 1726883062.39632: variable 'omit' from source: magic vars 33277 1726883062.39635: starting attempt loop 33277 1726883062.39637: running the handler 33277 1726883062.39640: _low_level_execute_command(): starting 33277 1726883062.39643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 33277 1726883062.40340: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883062.40407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883062.40458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883062.40525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883062.40607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883062.42332: stdout chunk (state=3): >>>/root <<< 33277 1726883062.42516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883062.42532: stdout chunk (state=3): >>><<< 33277 1726883062.42548: stderr chunk (state=3): >>><<< 33277 1726883062.42578: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883062.42695: _low_level_execute_command(): starting 33277 1726883062.42699: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971 `" && echo ansible-tmp-1726883062.4259417-33555-253078969141971="` echo /root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971 `" ) && sleep 0' 33277 1726883062.43306: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883062.43331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883062.43349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883062.43424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883062.45444: stdout chunk (state=3): 
>>>ansible-tmp-1726883062.4259417-33555-253078969141971=/root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971 <<< 33277 1726883062.45573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883062.45677: stderr chunk (state=3): >>><<< 33277 1726883062.45690: stdout chunk (state=3): >>><<< 33277 1726883062.45714: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726883062.4259417-33555-253078969141971=/root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883062.45782: variable 'ansible_module_compression' from source: unknown 33277 1726883062.46046: ANSIBALLZ: Using lock for stat 33277 1726883062.46049: ANSIBALLZ: Acquiring lock 33277 1726883062.46052: ANSIBALLZ: Lock acquired: 140085462455872 33277 1726883062.46054: 
ANSIBALLZ: Creating module 33277 1726883062.65240: ANSIBALLZ: Writing module into payload 33277 1726883062.65352: ANSIBALLZ: Writing module 33277 1726883062.65380: ANSIBALLZ: Renaming module 33277 1726883062.65392: ANSIBALLZ: Done creating module 33277 1726883062.65415: variable 'ansible_facts' from source: unknown 33277 1726883062.65499: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/AnsiballZ_stat.py 33277 1726883062.65770: Sending initial data 33277 1726883062.65774: Sent initial data (153 bytes) 33277 1726883062.66346: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 1726883062.66361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883062.66376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883062.66425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883062.66505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883062.66535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883062.66739: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883062.68366: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 33277 1726883062.68427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 33277 1726883062.68496: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-33277prfh61zr/tmp9yqidyxa /root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/AnsiballZ_stat.py <<< 33277 1726883062.68499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/AnsiballZ_stat.py" <<< 33277 1726883062.68574: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-33277prfh61zr/tmp9yqidyxa" to remote "/root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/AnsiballZ_stat.py" <<< 33277 1726883062.69979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883062.70069: stderr chunk (state=3): >>><<< 33277 1726883062.70229: stdout chunk (state=3): 
>>><<< 33277 1726883062.70232: done transferring module to remote 33277 1726883062.70235: _low_level_execute_command(): starting 33277 1726883062.70237: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/ /root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/AnsiballZ_stat.py && sleep 0' 33277 1726883062.71480: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883062.71509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883062.71563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883062.71642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883062.73638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883062.73642: stdout chunk (state=3): >>><<< 33277 1726883062.73645: stderr chunk (state=3): >>><<< 33277 1726883062.73662: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 33277 1726883062.73670: _low_level_execute_command(): starting 33277 1726883062.73679: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/AnsiballZ_stat.py && sleep 0' 33277 1726883062.74466: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 1726883062.74484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883062.74500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883062.74536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883062.74548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883062.74632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' <<< 33277 1726883062.74652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883062.74671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883062.74753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883062.77145: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 33277 1726883062.77221: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 33277 1726883062.77284: stdout chunk (state=3): >>>import 'posix' # <<< 33277 1726883062.77338: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 33277 1726883062.77350: stdout chunk (state=3): >>>import 'time' # <<< 33277 1726883062.77367: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 33277 1726883062.77400: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 33277 1726883062.77415: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 33277 1726883062.77447: stdout chunk (state=3): >>>import 'codecs' # <<< 33277 1726883062.77492: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 33277 1726883062.77510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fadfc530> <<< 33277 1726883062.77539: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fadcbb30> <<< 33277 1726883062.77625: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fadfeab0> <<< 33277 1726883062.77628: stdout chunk (state=3): >>>import '_signal' # <<< 33277 1726883062.77645: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 33277 1726883062.77649: stdout chunk (state=3): >>>import 'io' # <<< 33277 1726883062.77677: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 33277 1726883062.77765: stdout chunk (state=3): >>>import '_collections_abc' # <<< 33277 1726883062.77803: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 33277 1726883062.77871: stdout chunk (state=3): >>>import 'os' # <<< 33277 1726883062.77882: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 33277 1726883062.77927: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 33277 1726883062.77947: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fabd11c0> <<< 33277 1726883062.78016: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 33277 1726883062.78043: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fabd2000> <<< 33277 1726883062.78061: stdout chunk (state=3): >>>import 'site' # <<< 33277 1726883062.78092: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 33277 1726883062.78365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 33277 1726883062.78372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 33277 1726883062.78395: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 33277 1726883062.78449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 33277 1726883062.78489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 33277 1726883062.78560: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 33277 1726883062.78602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0ff50> <<< 33277 1726883062.78606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 33277 1726883062.78638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 33277 1726883062.78642: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 33277 1726883062.78695: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883062.78739: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac47830> <<< 33277 1726883062.78795: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac47ec0> import '_collections' # <<< 33277 1726883062.78852: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac27b00> import '_functools' # <<< 33277 1726883062.78894: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac251f0> <<< 33277 1726883062.78982: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0d040> <<< 33277 1726883062.79041: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 33277 1726883062.79044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 33277 1726883062.79152: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 33277 1726883062.79158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py 
# code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac6b7d0> <<< 33277 1726883062.79193: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac6a3f0> <<< 33277 1726883062.79219: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 33277 1726883062.79391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac68bf0> <<< 33277 1726883062.79400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac98830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fac98ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac98b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883062.79446: stdout chunk (state=3): 
>>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fac98f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0ae40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883062.79456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 33277 1726883062.79503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac99670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac99340> import 'importlib.machinery' # <<< 33277 1726883062.79665: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac9a570> import 'importlib.util' # import 'runpy' # <<< 33277 1726883062.79705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 33277 1726883062.79735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38facb4770> import 'errno' # # 
extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883062.79766: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38facb5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 33277 1726883062.79862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 33277 1726883062.80089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38facb6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38facb73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38facb62a0> <<< 33277 1726883062.80093: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38facb7e00> <<< 33277 1726883062.80097: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f38facb7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac9a5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 33277 1726883062.80140: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38faa93cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 33277 1726883062.80202: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38faabc830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faabc590> <<< 33277 1726883062.80218: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38faabc860> <<< 33277 1726883062.80240: stdout chunk (state=3): >>># extension module '_sha2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38faabca40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faa91e50> <<< 33277 1726883062.80324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 33277 1726883062.80540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 33277 1726883062.80543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 33277 1726883062.80552: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faabe0f0> <<< 33277 1726883062.80594: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faabcd70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac9ac90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 33277 1726883062.80611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 33277 1726883062.80645: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faae6480> <<< 33277 1726883062.80708: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 33277 1726883062.80745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 33277 1726883062.80864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab02630> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 33277 1726883062.80971: stdout chunk (state=3): >>>import 'ntpath' # <<< 33277 1726883062.81040: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab37410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 33277 1726883062.81044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 33277 1726883062.81379: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 33277 1726883062.81383: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab5dbb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab37530> import 'pathlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f38fab032c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa940590> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab01670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faabf020> <<< 33277 1726883062.81476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 33277 1726883062.81497: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f38fa940860> <<< 33277 1726883062.81538: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_4kns4pxz/ansible_stat_payload.zip' <<< 33277 1726883062.81571: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.81706: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.81737: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 33277 1726883062.81800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 33277 1726883062.81858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 33277 1726883062.81898: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f38fa99a390> <<< 33277 1726883062.81924: stdout chunk (state=3): >>>import '_typing' # <<< 33277 1726883062.82112: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa971280> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9703e0> <<< 33277 1726883062.82247: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 33277 1726883062.83782: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.85153: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa998260> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa9c1e50> <<< 33277 1726883062.85180: stdout chunk 
(state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9c1be0> <<< 33277 1726883062.85216: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9c14f0> <<< 33277 1726883062.85244: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 33277 1726883062.85278: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9c1f40> <<< 33277 1726883062.85298: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa99b020> import 'atexit' # <<< 33277 1726883062.85327: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa9c2bd0> <<< 33277 1726883062.85356: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa9c2d50> <<< 33277 1726883062.85449: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 33277 1726883062.85465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 33277 1726883062.85484: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f38fa9c3260> <<< 33277 1726883062.85508: stdout chunk (state=3): >>>import 'pwd' # <<< 33277 1726883062.85550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 33277 1726883062.85636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 33277 1726883062.85649: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa828f50> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa82ab70> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 33277 1726883062.85685: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82b530> <<< 33277 1726883062.85705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 33277 1726883062.85742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 33277 1726883062.85856: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82c6e0> <<< 33277 1726883062.85859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 33277 1726883062.85862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 33277 1726883062.85892: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82f1d0> <<< 33277 1726883062.85931: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa82f2f0> <<< 33277 1726883062.85956: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82d490> <<< 33277 1726883062.85973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 33277 1726883062.86024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 33277 1726883062.86043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 33277 1726883062.86161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 33277 1726883062.86164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa833140> import '_tokenize' # <<< 33277 1726883062.86275: stdout chunk (state=3): 
>>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa831c10> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa831970> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 33277 1726883062.86303: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa831ee0> <<< 33277 1726883062.86337: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82d9a0> <<< 33277 1726883062.86375: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa87b230> <<< 33277 1726883062.86565: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87b380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 33277 1726883062.86583: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa87cf80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87cd40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 33277 1726883062.86712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa87f470> <<< 33277 1726883062.86879: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87d640> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 33277 1726883062.86895: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa886ba0> <<< 33277 1726883062.87159: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87f530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa8878c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa887d10> <<< 33277 1726883062.87253: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa887dd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87b620> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa88b4a0> <<< 33277 1726883062.87773: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # 
extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa88c9b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa889c40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa88aff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa889880> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 33277 1726883062.87777: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.87779: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 33277 1726883062.87782: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.87910: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.88148: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.88633: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.89233: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 33277 1726883062.89253: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 33277 1726883062.89330: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py <<< 33277 1726883062.89336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883062.89347: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa910b00> <<< 33277 1726883062.89432: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 33277 1726883062.89454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa911940> <<< 33277 1726883062.89465: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa88f4a0> <<< 33277 1726883062.89649: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 33277 1726883062.89809: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.89896: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 33277 1726883062.89917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa911880> <<< 33277 1726883062.89925: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.90436: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.90937: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.91012: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.91333: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 33277 1726883062.91342: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.91346: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.91349: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 33277 1726883062.91352: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.91354: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 33277 1726883062.91377: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.91391: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 33277 1726883062.91398: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.91440: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.91484: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 33277 1726883062.91745: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.92138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 33277 1726883062.92155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa912a20> # zipimport: zlib available <<< 33277 1726883062.92231: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.92337: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 33277 1726883062.92340: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 33277 1726883062.92464: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 33277 1726883062.92467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 33277 1726883062.92470: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883062.92627: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa722240> <<< 33277 1726883062.92650: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa722b70> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9138c0> # zipimport: zlib available <<< 33277 1726883062.92700: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.92935: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 33277 1726883062.92981: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 33277 1726883062.93029: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 33277 1726883062.93083: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 33277 1726883062.93093: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa7217c0> <<< 33277 1726883062.93131: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa722cc0> <<< 33277 1726883062.93153: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 33277 1726883062.93169: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 33277 1726883062.93242: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.93308: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.93337: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.93446: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 33277 1726883062.93504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 33277 1726883062.93527: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 33277 1726883062.93538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 33277 1726883062.93599: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa7b2e10> <<< 33277 1726883062.93647: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa72caa0> <<< 33277 1726883062.93729: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa72adb0> <<< 33277 1726883062.93780: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa72ac00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 33277 1726883062.93793: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.93896: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 33277 1726883062.94049: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 33277 1726883062.94055: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.94269: stdout chunk (state=3): >>># zipimport: zlib available <<< 33277 1726883062.94404: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 33277 1726883062.94731: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 33277 1726883062.94746: stdout chunk (state=3): >>># clear 
sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings <<< 33277 1726883062.94768: stdout chunk (state=3): >>># cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack <<< 33277 1726883062.94801: stdout chunk (state=3): >>># destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing 
re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap <<< 33277 1726883062.94817: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib <<< 33277 1726883062.94874: stdout chunk (state=3): >>># cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing 
__future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize <<< 33277 1726883062.94880: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat <<< 33277 1726883062.94954: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # 
destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 33277 1726883062.95350: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux <<< 33277 1726883062.95424: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 33277 1726883062.95430: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping 
_ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 33277 1726883062.95442: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 33277 1726883062.95469: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 33277 1726883062.95489: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum <<< 33277 1726883062.95531: stdout chunk (state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 33277 1726883062.95539: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 33277 1726883062.95553: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy 
_stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8<<< 33277 1726883062.95575: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 33277 1726883062.95589: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 33277 1726883062.95725: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 33277 1726883062.95770: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 33277 1726883062.95878: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 33277 1726883062.95942: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 33277 1726883062.96133: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy 
encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 33277 1726883062.96205: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 33277 1726883062.96209: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 33277 1726883062.96412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883062.96422: stderr chunk (state=3): >>>Shared connection to 10.31.13.201 closed. <<< 33277 1726883062.96515: stderr chunk (state=3): >>><<< 33277 1726883062.96639: stdout chunk (state=3): >>><<< 33277 1726883062.96720: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fadfc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fadcbb30> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fadfeab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fabd11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fabd2000> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0ff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac47830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f38fac47ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac27b00> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac251f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac6b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac6a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac68bf0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac98830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fac98ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac98b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fac98f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac0ae40> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac99670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac99340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac9a570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38facb4770> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38facb5eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38facb6d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38facb73b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38facb62a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f38facb7e00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38facb7530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac9a5a0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38faa93cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38faabc830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faabc590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38faabc860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38faabca40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faa91e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faabe0f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faabcd70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fac9ac90> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faae6480> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab02630> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab37410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab5dbb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab37530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab032c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa940590> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fab01670> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38faabf020> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f38fa940860> # zipimport: found 30 names in '/tmp/ansible_stat_payload_4kns4pxz/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa99a390> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa971280> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9703e0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa998260> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa9c1e50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9c1be0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9c14f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9c1f40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa99b020> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa9c2bd0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa9c2d50> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9c3260> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa828f50> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa82ab70> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82b530> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82c6e0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82f1d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa82f2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82d490> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa833140> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa831c10> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa831970> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa831ee0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa82d9a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa87b230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87b380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa87cf80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87cd40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa87f470> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87d640> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa886ba0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87f530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa8878c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa887d10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa887dd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa87b620> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa88b4a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa88c9b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa889c40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa88aff0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa889880> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa910b00> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa911940> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa88f4a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa911880> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa912a20> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa722240> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa722b70> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa9138c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38fa7217c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa722cc0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa7b2e10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa72caa0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa72adb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38fa72ac00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # 
clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # 
cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy 
copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] 
removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy 
copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.201 closed. [WARNING]: Module invocation had junk after the JSON data: [interpreter shutdown trace, identical to the module stdout shown above] 33277 1726883062.97860: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, 
'_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 33277 1726883062.97864: _low_level_execute_command(): starting 33277 1726883062.97866: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726883062.4259417-33555-253078969141971/ > /dev/null 2>&1 && sleep 0' 33277 1726883062.98111: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 33277 1726883062.98239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 33277 1726883062.98250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 33277 1726883062.98269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 33277 1726883062.98281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 <<< 33277 1726883062.98329: stderr chunk (state=3): >>>debug2: match not found <<< 33277 1726883062.98333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 33277 1726883062.98336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 33277 1726883062.98610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK <<< 33277 1726883062.98631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 33277 1726883062.98698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 33277 1726883063.00691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 33277 1726883063.00727: stderr chunk (state=3): >>><<< 33277 1726883063.00938: stdout chunk (state=3): >>><<< 33277 1726883063.00955: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.201 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.201 originally 10.31.13.201 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2837f5298a' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 33277 1726883063.00962: handler run complete 33277 1726883063.00986: attempt loop complete, returning result 33277 1726883063.00991: _execute() done 33277 1726883063.00994: dumping result to json 33277 1726883063.01016: done dumping result, returning 33277 1726883063.01019: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0affc7ec-ae25-6628-6da4-00000000015a] 33277 1726883063.01021: sending task result for task 0affc7ec-ae25-6628-6da4-00000000015a ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 33277 1726883063.01332: no more pending results, returning what we have 33277 1726883063.01335: results queue empty 33277 1726883063.01336: checking for any_errors_fatal 33277 1726883063.01342: done checking for any_errors_fatal 33277 1726883063.01343: checking for max_fail_percentage 33277 1726883063.01345: done checking for max_fail_percentage 33277 1726883063.01345: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.01346: done checking to see if all hosts have failed 33277 1726883063.01347: getting the remaining hosts for this loop 33277 1726883063.01349: done getting the remaining hosts for this loop 33277 1726883063.01353: getting the next task for host managed_node2 33277 1726883063.01358: done getting next task for host managed_node2 33277 1726883063.01361: ^ task is: TASK: Set flag to indicate system is ostree 33277 1726883063.01363: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 33277 1726883063.01367: getting variables 33277 1726883063.01368: in VariableManager get_vars() 33277 1726883063.01403: Calling all_inventory to load vars for managed_node2 33277 1726883063.01406: Calling groups_inventory to load vars for managed_node2 33277 1726883063.01410: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.01421: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.01541: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.01547: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.01874: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000015a 33277 1726883063.01879: WORKER PROCESS EXITING 33277 1726883063.01899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.02573: done with get_vars() 33277 1726883063.02584: done getting variables 33277 1726883063.02689: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:44:23 -0400 (0:00:00.662) 0:00:06.215 ****** 33277 1726883063.02719: entering _queue_task() for managed_node2/set_fact 33277 1726883063.02721: Creating lock for set_fact 33277 1726883063.03435: worker is 1 (out of 1 available) 33277 1726883063.03447: exiting _queue_task() for managed_node2/set_fact 33277 1726883063.03458: done queuing things up, now waiting for results queue to drain 33277 1726883063.03459: waiting for pending 
results... 33277 1726883063.03837: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 33277 1726883063.04174: in run() - task 0affc7ec-ae25-6628-6da4-00000000015b 33277 1726883063.04197: variable 'ansible_search_path' from source: unknown 33277 1726883063.04205: variable 'ansible_search_path' from source: unknown 33277 1726883063.04502: calling self._execute() 33277 1726883063.04516: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.04532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.04547: variable 'omit' from source: magic vars 33277 1726883063.05615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 33277 1726883063.06184: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 33277 1726883063.06428: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 33277 1726883063.06431: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 33277 1726883063.06434: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 33277 1726883063.06615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 33277 1726883063.06704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 33277 1726883063.06740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 33277 1726883063.06814: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 33277 1726883063.07138: Evaluated conditional (not __network_is_ostree is defined): True 33277 1726883063.07150: variable 'omit' from source: magic vars 33277 1726883063.07195: variable 'omit' from source: magic vars 33277 1726883063.07632: variable '__ostree_booted_stat' from source: set_fact 33277 1726883063.07635: variable 'omit' from source: magic vars 33277 1726883063.07664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33277 1726883063.07758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33277 1726883063.07877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33277 1726883063.07880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33277 1726883063.07883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33277 1726883063.07886: variable 'inventory_hostname' from source: host vars for 'managed_node2' 33277 1726883063.07888: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.07890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.08076: Set connection var ansible_pipelining to False 33277 1726883063.08086: Set connection var ansible_connection to ssh 33277 1726883063.08102: Set connection var ansible_timeout to 10 33277 1726883063.08114: Set connection var ansible_shell_executable to /bin/sh 33277 1726883063.08121: Set connection var ansible_shell_type to sh 33277 1726883063.08133: Set connection var ansible_module_compression to ZIP_DEFLATED 33277 1726883063.08157: variable 'ansible_shell_executable' 
from source: unknown 33277 1726883063.08165: variable 'ansible_connection' from source: unknown 33277 1726883063.08203: variable 'ansible_module_compression' from source: unknown 33277 1726883063.08206: variable 'ansible_shell_type' from source: unknown 33277 1726883063.08209: variable 'ansible_shell_executable' from source: unknown 33277 1726883063.08211: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.08213: variable 'ansible_pipelining' from source: unknown 33277 1726883063.08215: variable 'ansible_timeout' from source: unknown 33277 1726883063.08217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.08354: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 33277 1726883063.08371: variable 'omit' from source: magic vars 33277 1726883063.08421: starting attempt loop 33277 1726883063.08424: running the handler 33277 1726883063.08429: handler run complete 33277 1726883063.08432: attempt loop complete, returning result 33277 1726883063.08435: _execute() done 33277 1726883063.08438: dumping result to json 33277 1726883063.08442: done dumping result, returning 33277 1726883063.08449: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0affc7ec-ae25-6628-6da4-00000000015b] 33277 1726883063.08458: sending task result for task 0affc7ec-ae25-6628-6da4-00000000015b 33277 1726883063.08597: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000015b 33277 1726883063.08601: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 33277 1726883063.08660: no more pending results, returning what we have 33277 1726883063.08663: results 
queue empty 33277 1726883063.08665: checking for any_errors_fatal 33277 1726883063.08673: done checking for any_errors_fatal 33277 1726883063.08673: checking for max_fail_percentage 33277 1726883063.08675: done checking for max_fail_percentage 33277 1726883063.08675: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.08676: done checking to see if all hosts have failed 33277 1726883063.08677: getting the remaining hosts for this loop 33277 1726883063.08678: done getting the remaining hosts for this loop 33277 1726883063.08682: getting the next task for host managed_node2 33277 1726883063.08694: done getting next task for host managed_node2 33277 1726883063.08697: ^ task is: TASK: Fix CentOS6 Base repo 33277 1726883063.08699: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.08703: getting variables 33277 1726883063.08704: in VariableManager get_vars() 33277 1726883063.08736: Calling all_inventory to load vars for managed_node2 33277 1726883063.08739: Calling groups_inventory to load vars for managed_node2 33277 1726883063.08742: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.08753: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.08755: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.08765: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.09166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.09609: done with get_vars() 33277 1726883063.09620: done getting variables 33277 1726883063.09950: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:44:23 -0400 (0:00:00.072) 0:00:06.288 ****** 33277 1726883063.09979: entering _queue_task() for managed_node2/copy 33277 1726883063.10569: worker is 1 (out of 1 available) 33277 1726883063.10580: exiting _queue_task() for managed_node2/copy 33277 1726883063.10593: done queuing things up, now waiting for results queue to drain 33277 1726883063.10595: waiting for pending results... 
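The two tasks that completed above — the `stat` of `/run/ostree-booted` (result: `exists: false`) and the follow-up `set_fact` that records `__network_is_ostree: false` — can be sketched roughly as below. This is a reconstruction inferred from the trace (the path, the register name `__ostree_booted_stat`, and the evaluated conditional all appear in the log); it is not the verbatim contents of `el_repo_setup.yml`:

```yaml
# Reconstructed sketch, not the actual el_repo_setup.yml source.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted          # marker file present only on ostree-based systems
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined   # the conditional the log evaluates to True
```

On this host the marker file does not exist, so the fact is set to `false` and the later `not __network_is_ostree | d(false)` check lets the EPEL include run.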
33277 1726883063.11141: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 33277 1726883063.11151: in run() - task 0affc7ec-ae25-6628-6da4-00000000015d 33277 1726883063.11155: variable 'ansible_search_path' from source: unknown 33277 1726883063.11158: variable 'ansible_search_path' from source: unknown 33277 1726883063.11233: calling self._execute() 33277 1726883063.11318: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.11629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.11632: variable 'omit' from source: magic vars 33277 1726883063.12552: variable 'ansible_distribution' from source: facts 33277 1726883063.12691: Evaluated conditional (ansible_distribution == 'CentOS'): False 33277 1726883063.12776: when evaluation is False, skipping this task 33277 1726883063.12779: _execute() done 33277 1726883063.12781: dumping result to json 33277 1726883063.12784: done dumping result, returning 33277 1726883063.12786: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0affc7ec-ae25-6628-6da4-00000000015d] 33277 1726883063.12788: sending task result for task 0affc7ec-ae25-6628-6da4-00000000015d 33277 1726883063.12870: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000015d 33277 1726883063.12873: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 33277 1726883063.12963: no more pending results, returning what we have 33277 1726883063.12968: results queue empty 33277 1726883063.12969: checking for any_errors_fatal 33277 1726883063.12974: done checking for any_errors_fatal 33277 1726883063.12974: checking for max_fail_percentage 33277 1726883063.12976: done checking for max_fail_percentage 33277 1726883063.12976: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.12977: done 
checking to see if all hosts have failed 33277 1726883063.12978: getting the remaining hosts for this loop 33277 1726883063.12979: done getting the remaining hosts for this loop 33277 1726883063.12984: getting the next task for host managed_node2 33277 1726883063.12994: done getting next task for host managed_node2 33277 1726883063.12997: ^ task is: TASK: Include the task 'enable_epel.yml' 33277 1726883063.12999: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883063.13004: getting variables 33277 1726883063.13005: in VariableManager get_vars() 33277 1726883063.13038: Calling all_inventory to load vars for managed_node2 33277 1726883063.13041: Calling groups_inventory to load vars for managed_node2 33277 1726883063.13045: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.13060: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.13063: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.13066: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.13509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.14156: done with get_vars() 33277 1726883063.14167: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:44:23 -0400 (0:00:00.042) 0:00:06.331 ****** 33277 1726883063.14269: entering _queue_task() for managed_node2/include_tasks 33277 1726883063.15005: worker is 1 (out of 1 available) 33277 1726883063.15019: exiting _queue_task() for managed_node2/include_tasks 33277 1726883063.15232: done queuing things up, now waiting for results queue to drain 33277 1726883063.15234: waiting for pending results... 33277 1726883063.15516: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 33277 1726883063.15930: in run() - task 0affc7ec-ae25-6628-6da4-00000000015e 33277 1726883063.15934: variable 'ansible_search_path' from source: unknown 33277 1726883063.15937: variable 'ansible_search_path' from source: unknown 33277 1726883063.15939: calling self._execute() 33277 1726883063.15977: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.16049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.16064: variable 'omit' from source: magic vars 33277 1726883063.17152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 33277 1726883063.22685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 33277 1726883063.22690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 33277 1726883063.22769: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 33277 1726883063.22946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 33277 1726883063.22980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 33277 1726883063.23228: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 33277 1726883063.23235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 33277 1726883063.23265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 33277 1726883063.23384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 33277 1726883063.23466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 33277 1726883063.23675: variable '__network_is_ostree' from source: set_fact 33277 1726883063.23699: Evaluated conditional (not __network_is_ostree | d(false)): True 33277 1726883063.23711: _execute() done 33277 1726883063.23719: dumping result to json 33277 1726883063.23731: done dumping result, returning 33277 1726883063.23744: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0affc7ec-ae25-6628-6da4-00000000015e] 33277 1726883063.23754: sending task result for task 0affc7ec-ae25-6628-6da4-00000000015e 33277 1726883063.23880: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000015e 33277 1726883063.23889: WORKER PROCESS EXITING 33277 1726883063.24051: no more pending results, returning what we have 33277 1726883063.24056: in VariableManager get_vars() 33277 1726883063.24093: Calling all_inventory to load vars for managed_node2 33277 
1726883063.24096: Calling groups_inventory to load vars for managed_node2 33277 1726883063.24100: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.24111: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.24113: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.24116: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.24534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.25181: done with get_vars() 33277 1726883063.25192: variable 'ansible_search_path' from source: unknown 33277 1726883063.25194: variable 'ansible_search_path' from source: unknown 33277 1726883063.25439: we have included files to process 33277 1726883063.25440: generating all_blocks data 33277 1726883063.25441: done generating all_blocks data 33277 1726883063.25445: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33277 1726883063.25446: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33277 1726883063.25449: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 33277 1726883063.27036: done processing included file 33277 1726883063.27038: iterating over new_blocks loaded from include file 33277 1726883063.27040: in VariableManager get_vars() 33277 1726883063.27054: done with get_vars() 33277 1726883063.27055: filtering new block on tags 33277 1726883063.27080: done filtering new block on tags 33277 1726883063.27083: in VariableManager get_vars() 33277 1726883063.27099: done with get_vars() 33277 1726883063.27101: filtering new block on tags 33277 1726883063.27113: done filtering new block on tags 33277 1726883063.27115: done iterating over new_blocks loaded from include file included: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 33277 1726883063.27121: extending task lists for all hosts with included blocks 33277 1726883063.27458: done extending task lists 33277 1726883063.27574: done processing included files 33277 1726883063.27575: results queue empty 33277 1726883063.27576: checking for any_errors_fatal 33277 1726883063.27579: done checking for any_errors_fatal 33277 1726883063.27580: checking for max_fail_percentage 33277 1726883063.27581: done checking for max_fail_percentage 33277 1726883063.27582: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.27583: done checking to see if all hosts have failed 33277 1726883063.27584: getting the remaining hosts for this loop 33277 1726883063.27587: done getting the remaining hosts for this loop 33277 1726883063.27591: getting the next task for host managed_node2 33277 1726883063.27595: done getting next task for host managed_node2 33277 1726883063.27597: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 33277 1726883063.27600: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.27602: getting variables 33277 1726883063.27603: in VariableManager get_vars() 33277 1726883063.27612: Calling all_inventory to load vars for managed_node2 33277 1726883063.27615: Calling groups_inventory to load vars for managed_node2 33277 1726883063.27617: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.27625: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.27634: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.27637: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.27995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.28547: done with get_vars() 33277 1726883063.28563: done getting variables 33277 1726883063.28793: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 33277 1726883063.29174: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:44:23 -0400 (0:00:00.150) 0:00:06.481 ****** 33277 1726883063.29341: entering _queue_task() for managed_node2/command 33277 1726883063.29343: Creating lock for command 33277 1726883063.30052: worker is 1 (out of 1 available) 33277 1726883063.30065: exiting _queue_task() for managed_node2/command 33277 1726883063.30126: done queuing things up, now waiting for results queue to drain 33277 1726883063.30128: waiting for pending results... 
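The task header just rendered — `TASK [Create EPEL 40]`, produced by templating `ansible_distribution_major_version` (a fact, per the log) into the name `Create EPEL {{ ansible_distribution_major_version }}` — together with the conditional evaluated just below suggests a task of roughly this shape. This is a hedged sketch; the actual task body at `enable_epel.yml:8` is not recoverable from this excerpt:

```yaml
# Hypothetical sketch; the real task body is elided because the log does not show it.
- name: Create EPEL {{ ansible_distribution_major_version }}   # renders as "Create EPEL 40" here
  command: ...   # elided; not shown in the log
  when: ansible_distribution in ['RedHat', 'CentOS']   # False on this host, so the task is skipped
```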
33277 1726883063.30843: running TaskExecutor() for managed_node2/TASK: Create EPEL 40 33277 1726883063.30848: in run() - task 0affc7ec-ae25-6628-6da4-000000000178 33277 1726883063.30852: variable 'ansible_search_path' from source: unknown 33277 1726883063.30854: variable 'ansible_search_path' from source: unknown 33277 1726883063.30857: calling self._execute() 33277 1726883063.31228: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.31232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.31235: variable 'omit' from source: magic vars 33277 1726883063.31913: variable 'ansible_distribution' from source: facts 33277 1726883063.31934: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33277 1726883063.31942: when evaluation is False, skipping this task 33277 1726883063.31950: _execute() done 33277 1726883063.31958: dumping result to json 33277 1726883063.31966: done dumping result, returning 33277 1726883063.31977: done running TaskExecutor() for managed_node2/TASK: Create EPEL 40 [0affc7ec-ae25-6628-6da4-000000000178] 33277 1726883063.31986: sending task result for task 0affc7ec-ae25-6628-6da4-000000000178 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33277 1726883063.32406: no more pending results, returning what we have 33277 1726883063.32410: results queue empty 33277 1726883063.32411: checking for any_errors_fatal 33277 1726883063.32413: done checking for any_errors_fatal 33277 1726883063.32414: checking for max_fail_percentage 33277 1726883063.32415: done checking for max_fail_percentage 33277 1726883063.32416: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.32417: done checking to see if all hosts have failed 33277 1726883063.32417: getting the remaining hosts for this loop 33277 1726883063.32419: done 
getting the remaining hosts for this loop 33277 1726883063.32425: getting the next task for host managed_node2 33277 1726883063.32431: done getting next task for host managed_node2 33277 1726883063.32434: ^ task is: TASK: Install yum-utils package 33277 1726883063.32437: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.32442: getting variables 33277 1726883063.32444: in VariableManager get_vars() 33277 1726883063.32475: Calling all_inventory to load vars for managed_node2 33277 1726883063.32477: Calling groups_inventory to load vars for managed_node2 33277 1726883063.32481: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.32497: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.32500: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.32503: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.32817: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000178 33277 1726883063.32820: WORKER PROCESS EXITING 33277 1726883063.33088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.33704: done with get_vars() 33277 1726883063.33716: done getting variables 33277 1726883063.33937: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:44:23 -0400 (0:00:00.046) 0:00:06.528 ****** 33277 1726883063.33968: entering _queue_task() for managed_node2/package 33277 1726883063.33970: Creating lock for package 33277 1726883063.34679: worker is 1 (out of 1 available) 33277 1726883063.34694: exiting _queue_task() for managed_node2/package 33277 1726883063.34707: done queuing things up, now waiting for results queue to drain 33277 1726883063.34709: waiting for pending results... 
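The `package` action plugin loaded above for `TASK [Install yum-utils package]` is consistent with a task like the following sketch — an assumed form (name, state, and guard inferred from the task name and the `false_condition` the log reports just below), not the verbatim `enable_epel.yml:26`:

```yaml
# Assumed sketch of the yum-utils task; skipped on this non-RedHat/CentOS host.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']
```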
33277 1726883063.35240: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 33277 1726883063.35245: in run() - task 0affc7ec-ae25-6628-6da4-000000000179 33277 1726883063.35248: variable 'ansible_search_path' from source: unknown 33277 1726883063.35250: variable 'ansible_search_path' from source: unknown 33277 1726883063.35628: calling self._execute() 33277 1726883063.35632: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.35636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.35640: variable 'omit' from source: magic vars 33277 1726883063.36235: variable 'ansible_distribution' from source: facts 33277 1726883063.36628: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33277 1726883063.36631: when evaluation is False, skipping this task 33277 1726883063.36634: _execute() done 33277 1726883063.36637: dumping result to json 33277 1726883063.36639: done dumping result, returning 33277 1726883063.36642: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0affc7ec-ae25-6628-6da4-000000000179] 33277 1726883063.36644: sending task result for task 0affc7ec-ae25-6628-6da4-000000000179 33277 1726883063.36719: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000179 33277 1726883063.36725: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33277 1726883063.36796: no more pending results, returning what we have 33277 1726883063.36800: results queue empty 33277 1726883063.36801: checking for any_errors_fatal 33277 1726883063.36811: done checking for any_errors_fatal 33277 1726883063.36812: checking for max_fail_percentage 33277 1726883063.36814: done checking for max_fail_percentage 33277 1726883063.36814: checking to see if all hosts have failed and the running result is not ok 
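The skip record above is the pattern that repeats for every EPEL task in this run: the `when:` clause `ansible_distribution in ['RedHat', 'CentOS']` is rendered against the host's facts, evaluates False, and the task is skipped with the failing expression reported as `false_condition`. A minimal plain-Python sketch of that decision follows. Note Ansible actually evaluates the expression through Jinja2 templating, and the fact value `'Fedora'` is an assumption inferred only from the conditional coming out False on this host.

```python
# Sketch of the `when:` skip decision logged above. Plain Python stands in
# for Ansible's Jinja2 conditional evaluation; only the membership test and
# the shape of the skip result are mirrored here.

def evaluate_when(facts: dict, distro_whitelist: list) -> dict:
    """Return a task-result dict shaped like the skip output in the log."""
    condition = facts["ansible_distribution"] in distro_whitelist
    if not condition:
        return {
            "changed": False,
            "false_condition": "ansible_distribution in %r" % distro_whitelist,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# 'Fedora' is an assumed fact value; the log only shows the test was False.
result = evaluate_when({"ansible_distribution": "Fedora"},
                       ["RedHat", "CentOS"])
```

The same evaluation repeats verbatim for the "Enable EPEL 7", "Enable EPEL 8", and "Enable EPEL 6" tasks that follow, each producing an identical `skipping:` result.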
33277 1726883063.36815: done checking to see if all hosts have failed 33277 1726883063.36816: getting the remaining hosts for this loop 33277 1726883063.36817: done getting the remaining hosts for this loop 33277 1726883063.36823: getting the next task for host managed_node2 33277 1726883063.36830: done getting next task for host managed_node2 33277 1726883063.36832: ^ task is: TASK: Enable EPEL 7 33277 1726883063.36836: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.36840: getting variables 33277 1726883063.36842: in VariableManager get_vars() 33277 1726883063.36985: Calling all_inventory to load vars for managed_node2 33277 1726883063.36991: Calling groups_inventory to load vars for managed_node2 33277 1726883063.36994: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.37004: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.37007: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.37010: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.37371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.37928: done with get_vars() 33277 1726883063.37938: done getting variables 33277 1726883063.38005: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:44:23 -0400 (0:00:00.042) 0:00:06.570 ****** 33277 1726883063.38245: entering _queue_task() for managed_node2/command 33277 1726883063.38759: worker is 1 (out of 1 available) 33277 1726883063.38768: exiting _queue_task() for managed_node2/command 33277 1726883063.38779: done queuing things up, now waiting for results queue to drain 33277 1726883063.38781: waiting for pending results... 
33277 1726883063.39023: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 33277 1726883063.39251: in run() - task 0affc7ec-ae25-6628-6da4-00000000017a 33277 1726883063.39391: variable 'ansible_search_path' from source: unknown 33277 1726883063.39433: variable 'ansible_search_path' from source: unknown 33277 1726883063.39829: calling self._execute() 33277 1726883063.39832: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.39835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.39939: variable 'omit' from source: magic vars 33277 1726883063.40704: variable 'ansible_distribution' from source: facts 33277 1726883063.40726: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33277 1726883063.40815: when evaluation is False, skipping this task 33277 1726883063.40827: _execute() done 33277 1726883063.40835: dumping result to json 33277 1726883063.40843: done dumping result, returning 33277 1726883063.40854: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0affc7ec-ae25-6628-6da4-00000000017a] 33277 1726883063.40864: sending task result for task 0affc7ec-ae25-6628-6da4-00000000017a skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33277 1726883063.41033: no more pending results, returning what we have 33277 1726883063.41038: results queue empty 33277 1726883063.41039: checking for any_errors_fatal 33277 1726883063.41047: done checking for any_errors_fatal 33277 1726883063.41048: checking for max_fail_percentage 33277 1726883063.41049: done checking for max_fail_percentage 33277 1726883063.41050: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.41051: done checking to see if all hosts have failed 33277 1726883063.41052: getting the remaining hosts for this loop 33277 1726883063.41053: done 
getting the remaining hosts for this loop 33277 1726883063.41058: getting the next task for host managed_node2 33277 1726883063.41065: done getting next task for host managed_node2 33277 1726883063.41068: ^ task is: TASK: Enable EPEL 8 33277 1726883063.41072: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.41077: getting variables 33277 1726883063.41078: in VariableManager get_vars() 33277 1726883063.41113: Calling all_inventory to load vars for managed_node2 33277 1726883063.41117: Calling groups_inventory to load vars for managed_node2 33277 1726883063.41123: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.41149: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.41152: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.41156: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.41681: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000017a 33277 1726883063.41690: WORKER PROCESS EXITING 33277 1726883063.41731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.41976: done with get_vars() 33277 1726883063.41988: done getting variables 33277 1726883063.42053: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:44:23 -0400 (0:00:00.038) 0:00:06.609 ****** 33277 1726883063.42083: entering _queue_task() for managed_node2/command 33277 1726883063.42350: worker is 1 (out of 1 available) 33277 1726883063.42364: exiting _queue_task() for managed_node2/command 33277 1726883063.42376: done queuing things up, now waiting for results queue to drain 33277 1726883063.42378: waiting for pending results... 
33277 1726883063.42665: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 33277 1726883063.42796: in run() - task 0affc7ec-ae25-6628-6da4-00000000017b 33277 1726883063.42817: variable 'ansible_search_path' from source: unknown 33277 1726883063.42830: variable 'ansible_search_path' from source: unknown 33277 1726883063.42881: calling self._execute() 33277 1726883063.42976: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.42994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.43010: variable 'omit' from source: magic vars 33277 1726883063.43480: variable 'ansible_distribution' from source: facts 33277 1726883063.43505: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33277 1726883063.43517: when evaluation is False, skipping this task 33277 1726883063.43528: _execute() done 33277 1726883063.43536: dumping result to json 33277 1726883063.43582: done dumping result, returning 33277 1726883063.43593: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0affc7ec-ae25-6628-6da4-00000000017b] 33277 1726883063.43610: sending task result for task 0affc7ec-ae25-6628-6da4-00000000017b 33277 1726883063.43799: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000017b 33277 1726883063.43802: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33277 1726883063.43859: no more pending results, returning what we have 33277 1726883063.43864: results queue empty 33277 1726883063.43865: checking for any_errors_fatal 33277 1726883063.43870: done checking for any_errors_fatal 33277 1726883063.43871: checking for max_fail_percentage 33277 1726883063.43872: done checking for max_fail_percentage 33277 1726883063.43873: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.43874: 
done checking to see if all hosts have failed 33277 1726883063.43875: getting the remaining hosts for this loop 33277 1726883063.43876: done getting the remaining hosts for this loop 33277 1726883063.43881: getting the next task for host managed_node2 33277 1726883063.43895: done getting next task for host managed_node2 33277 1726883063.43898: ^ task is: TASK: Enable EPEL 6 33277 1726883063.43902: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.43906: getting variables 33277 1726883063.43908: in VariableManager get_vars() 33277 1726883063.44052: Calling all_inventory to load vars for managed_node2 33277 1726883063.44056: Calling groups_inventory to load vars for managed_node2 33277 1726883063.44059: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.44069: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.44071: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.44074: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.44338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.44598: done with get_vars() 33277 1726883063.44609: done getting variables 33277 1726883063.44676: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:44:23 -0400 (0:00:00.026) 0:00:06.635 ****** 33277 1726883063.44718: entering _queue_task() for managed_node2/copy 33277 1726883063.45123: worker is 1 (out of 1 available) 33277 1726883063.45142: exiting _queue_task() for managed_node2/copy 33277 1726883063.45153: done queuing things up, now waiting for results queue to drain 33277 1726883063.45155: waiting for pending results... 
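Each "^ state is: HOST STATE: …" line above is the linear strategy's cursor into the play: the outer `block=`/`task=` counters advance as tasks complete, and nested `tasks child state?` entries track position inside included blocks. As a rough illustration, the top-level counters can be pulled out of such a line with a regular expression; the field layout here is inferred from the log text itself, not from a documented Ansible API.

```python
import re

# Extract the top-level block/task/rescue/always counters from a
# "HOST STATE:" line as it appears in the log dump above.
STATE_RE = re.compile(r"block=(\d+), task=(\d+), rescue=(\d+), always=(\d+)")

def parse_host_state(line: str):
    """Return (block, task, rescue, always) from the outermost state, or None."""
    m = STATE_RE.search(line)  # first match is the outermost state
    if not m:
        return None
    return tuple(int(g) for g in m.groups())

sample = ("HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, "
          "run_state=1, fail_state=0")
```

Comparing successive state lines in the log (e.g. the inner `task=3` advancing to `task=4` between the "Enable EPEL 7" and "Enable EPEL 8" tasks) shows the nested cursor stepping through the included `enable_epel.yml` block while the outer `block=2, task=4` position stays fixed.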
33277 1726883063.45447: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 33277 1726883063.45542: in run() - task 0affc7ec-ae25-6628-6da4-00000000017d 33277 1726883063.45547: variable 'ansible_search_path' from source: unknown 33277 1726883063.45550: variable 'ansible_search_path' from source: unknown 33277 1726883063.45554: calling self._execute() 33277 1726883063.45633: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.45658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.45674: variable 'omit' from source: magic vars 33277 1726883063.46226: variable 'ansible_distribution' from source: facts 33277 1726883063.46248: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 33277 1726883063.46256: when evaluation is False, skipping this task 33277 1726883063.46263: _execute() done 33277 1726883063.46310: dumping result to json 33277 1726883063.46313: done dumping result, returning 33277 1726883063.46317: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0affc7ec-ae25-6628-6da4-00000000017d] 33277 1726883063.46320: sending task result for task 0affc7ec-ae25-6628-6da4-00000000017d skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 33277 1726883063.46584: no more pending results, returning what we have 33277 1726883063.46591: results queue empty 33277 1726883063.46593: checking for any_errors_fatal 33277 1726883063.46600: done checking for any_errors_fatal 33277 1726883063.46601: checking for max_fail_percentage 33277 1726883063.46602: done checking for max_fail_percentage 33277 1726883063.46603: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.46604: done checking to see if all hosts have failed 33277 1726883063.46605: getting the remaining hosts for this loop 33277 1726883063.46607: done 
getting the remaining hosts for this loop 33277 1726883063.46612: getting the next task for host managed_node2 33277 1726883063.46623: done getting next task for host managed_node2 33277 1726883063.46627: ^ task is: TASK: Set network provider to 'nm' 33277 1726883063.46629: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883063.46633: getting variables 33277 1726883063.46635: in VariableManager get_vars() 33277 1726883063.46667: Calling all_inventory to load vars for managed_node2 33277 1726883063.46670: Calling groups_inventory to load vars for managed_node2 33277 1726883063.46674: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.46692: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.46696: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.46699: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.47078: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000017d 33277 1726883063.47082: WORKER PROCESS EXITING 33277 1726883063.47108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.47341: done with get_vars() 33277 1726883063.47352: done getting variables 33277 1726883063.47424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 Friday 20 September 2024 21:44:23 -0400 (0:00:00.027) 0:00:06.662 ****** 33277 1726883063.47451: entering _queue_task() for managed_node2/set_fact 33277 1726883063.47850: worker is 1 (out of 1 available) 33277 1726883063.47862: exiting _queue_task() for managed_node2/set_fact 33277 1726883063.47874: done queuing things up, now waiting for results queue to drain 33277 1726883063.47875: waiting for pending results... 33277 1726883063.48059: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 33277 1726883063.48164: in run() - task 0affc7ec-ae25-6628-6da4-000000000007 33277 1726883063.48184: variable 'ansible_search_path' from source: unknown 33277 1726883063.48238: calling self._execute() 33277 1726883063.48331: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.48344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.48375: variable 'omit' from source: magic vars 33277 1726883063.48490: variable 'omit' from source: magic vars 33277 1726883063.48540: variable 'omit' from source: magic vars 33277 1726883063.48595: variable 'omit' from source: magic vars 33277 1726883063.48635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 33277 1726883063.48683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 33277 1726883063.48812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 33277 1726883063.48816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33277 1726883063.48819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 33277 1726883063.48821: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 33277 1726883063.48825: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.48827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.48935: Set connection var ansible_pipelining to False 33277 1726883063.48953: Set connection var ansible_connection to ssh 33277 1726883063.48964: Set connection var ansible_timeout to 10 33277 1726883063.48976: Set connection var ansible_shell_executable to /bin/sh 33277 1726883063.48983: Set connection var ansible_shell_type to sh 33277 1726883063.48997: Set connection var ansible_module_compression to ZIP_DEFLATED 33277 1726883063.49030: variable 'ansible_shell_executable' from source: unknown 33277 1726883063.49039: variable 'ansible_connection' from source: unknown 33277 1726883063.49127: variable 'ansible_module_compression' from source: unknown 33277 1726883063.49131: variable 'ansible_shell_type' from source: unknown 33277 1726883063.49138: variable 'ansible_shell_executable' from source: unknown 33277 1726883063.49141: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.49143: variable 'ansible_pipelining' from source: unknown 33277 1726883063.49145: variable 'ansible_timeout' from source: unknown 33277 1726883063.49147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.49258: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 33277 1726883063.49279: variable 'omit' from source: magic vars 33277 1726883063.49292: starting attempt loop 33277 1726883063.49299: running the handler 33277 1726883063.49328: handler run complete 33277 1726883063.49331: attempt loop 
complete, returning result 33277 1726883063.49357: _execute() done 33277 1726883063.49360: dumping result to json 33277 1726883063.49363: done dumping result, returning 33277 1726883063.49365: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0affc7ec-ae25-6628-6da4-000000000007] 33277 1726883063.49378: sending task result for task 0affc7ec-ae25-6628-6da4-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 33277 1726883063.49653: no more pending results, returning what we have 33277 1726883063.49657: results queue empty 33277 1726883063.49658: checking for any_errors_fatal 33277 1726883063.49665: done checking for any_errors_fatal 33277 1726883063.49666: checking for max_fail_percentage 33277 1726883063.49668: done checking for max_fail_percentage 33277 1726883063.49668: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.49669: done checking to see if all hosts have failed 33277 1726883063.49670: getting the remaining hosts for this loop 33277 1726883063.49672: done getting the remaining hosts for this loop 33277 1726883063.49676: getting the next task for host managed_node2 33277 1726883063.49734: done getting next task for host managed_node2 33277 1726883063.49737: ^ task is: TASK: meta (flush_handlers) 33277 1726883063.49739: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.49744: getting variables 33277 1726883063.49746: in VariableManager get_vars() 33277 1726883063.49776: Calling all_inventory to load vars for managed_node2 33277 1726883063.49779: Calling groups_inventory to load vars for managed_node2 33277 1726883063.49783: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.49898: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000007 33277 1726883063.49901: WORKER PROCESS EXITING 33277 1726883063.49913: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.49917: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.49920: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.50133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.50405: done with get_vars() 33277 1726883063.50415: done getting variables 33277 1726883063.50494: in VariableManager get_vars() 33277 1726883063.50503: Calling all_inventory to load vars for managed_node2 33277 1726883063.50506: Calling groups_inventory to load vars for managed_node2 33277 1726883063.50508: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.50513: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.50515: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.50519: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.50760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.51336: done with get_vars() 33277 1726883063.51351: done queuing things up, now waiting for results queue to drain 33277 1726883063.51353: results queue empty 33277 1726883063.51354: checking for any_errors_fatal 33277 1726883063.51356: done checking for any_errors_fatal 33277 1726883063.51357: checking for max_fail_percentage 33277 
1726883063.51362: done checking for max_fail_percentage 33277 1726883063.51363: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.51364: done checking to see if all hosts have failed 33277 1726883063.51365: getting the remaining hosts for this loop 33277 1726883063.51366: done getting the remaining hosts for this loop 33277 1726883063.51369: getting the next task for host managed_node2 33277 1726883063.51373: done getting next task for host managed_node2 33277 1726883063.51375: ^ task is: TASK: meta (flush_handlers) 33277 1726883063.51377: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883063.51389: getting variables 33277 1726883063.51390: in VariableManager get_vars() 33277 1726883063.51399: Calling all_inventory to load vars for managed_node2 33277 1726883063.51401: Calling groups_inventory to load vars for managed_node2 33277 1726883063.51403: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.51408: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.51410: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.51413: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.51776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.52245: done with get_vars() 33277 1726883063.52253: done getting variables 33277 1726883063.52305: in VariableManager get_vars() 33277 1726883063.52313: Calling all_inventory to load vars for managed_node2 33277 1726883063.52316: Calling groups_inventory to load vars for managed_node2 33277 1726883063.52318: Calling all_plugins_inventory to load vars for managed_node2 33277 
1726883063.52325: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.52328: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.52331: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.52514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.52759: done with get_vars() 33277 1726883063.52771: done queuing things up, now waiting for results queue to drain 33277 1726883063.52772: results queue empty 33277 1726883063.52778: checking for any_errors_fatal 33277 1726883063.52779: done checking for any_errors_fatal 33277 1726883063.52780: checking for max_fail_percentage 33277 1726883063.52781: done checking for max_fail_percentage 33277 1726883063.52782: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.52783: done checking to see if all hosts have failed 33277 1726883063.52784: getting the remaining hosts for this loop 33277 1726883063.52785: done getting the remaining hosts for this loop 33277 1726883063.52790: getting the next task for host managed_node2 33277 1726883063.52793: done getting next task for host managed_node2 33277 1726883063.52794: ^ task is: None 33277 1726883063.52795: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.52796: done queuing things up, now waiting for results queue to drain 33277 1726883063.52797: results queue empty 33277 1726883063.52798: checking for any_errors_fatal 33277 1726883063.52799: done checking for any_errors_fatal 33277 1726883063.52800: checking for max_fail_percentage 33277 1726883063.52801: done checking for max_fail_percentage 33277 1726883063.52801: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.52802: done checking to see if all hosts have failed 33277 1726883063.52804: getting the next task for host managed_node2 33277 1726883063.52806: done getting next task for host managed_node2 33277 1726883063.52807: ^ task is: None 33277 1726883063.52809: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.52860: in VariableManager get_vars() 33277 1726883063.52902: done with get_vars() 33277 1726883063.52909: in VariableManager get_vars() 33277 1726883063.52933: done with get_vars() 33277 1726883063.52938: variable 'omit' from source: magic vars 33277 1726883063.52971: in VariableManager get_vars() 33277 1726883063.53000: done with get_vars() 33277 1726883063.53028: variable 'omit' from source: magic vars PLAY [Play for testing wireless connection] ************************************ 33277 1726883063.53897: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 33277 1726883063.53928: getting the remaining hosts for this loop 33277 1726883063.53930: done getting the remaining hosts for this loop 33277 1726883063.53933: getting the next task for host managed_node2 33277 1726883063.53936: done getting next task for host managed_node2 33277 1726883063.53938: ^ task is: TASK: Gathering Facts 33277 1726883063.53940: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.53942: getting variables 33277 1726883063.53944: in VariableManager get_vars() 33277 1726883063.53962: Calling all_inventory to load vars for managed_node2 33277 1726883063.53965: Calling groups_inventory to load vars for managed_node2 33277 1726883063.53972: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.53978: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.53996: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.54000: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.54207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.54561: done with get_vars() 33277 1726883063.54575: done getting variables 33277 1726883063.54626: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3 Friday 20 September 2024 21:44:23 -0400 (0:00:00.071) 0:00:06.734 ****** 33277 1726883063.54649: entering _queue_task() for managed_node2/gather_facts 33277 1726883063.54984: worker is 1 (out of 1 available) 33277 1726883063.54998: exiting _queue_task() for managed_node2/gather_facts 33277 1726883063.55123: done queuing things up, now waiting for results queue to drain 33277 1726883063.55127: waiting for pending results... 
33277 1726883063.55294: running TaskExecutor() for managed_node2/TASK: Gathering Facts 33277 1726883063.55428: in run() - task 0affc7ec-ae25-6628-6da4-0000000001a3 33277 1726883063.55432: variable 'ansible_search_path' from source: unknown 33277 1726883063.55605: calling self._execute() 33277 1726883063.55767: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.55905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.55909: variable 'omit' from source: magic vars 33277 1726883063.56275: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.56294: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.56433: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.56452: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.56466: when evaluation is False, skipping this task 33277 1726883063.56473: _execute() done 33277 1726883063.56479: dumping result to json 33277 1726883063.56570: done dumping result, returning 33277 1726883063.56574: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affc7ec-ae25-6628-6da4-0000000001a3] 33277 1726883063.56577: sending task result for task 0affc7ec-ae25-6628-6da4-0000000001a3 33277 1726883063.56647: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000001a3 33277 1726883063.56651: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883063.56727: no more pending results, returning what we have 33277 1726883063.56732: results queue empty 33277 1726883063.56733: checking for any_errors_fatal 33277 1726883063.56734: done checking for any_errors_fatal 33277 1726883063.56735: checking for max_fail_percentage 33277 1726883063.56736: done checking for max_fail_percentage 
33277 1726883063.56737: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.56738: done checking to see if all hosts have failed 33277 1726883063.56739: getting the remaining hosts for this loop 33277 1726883063.56740: done getting the remaining hosts for this loop 33277 1726883063.56744: getting the next task for host managed_node2 33277 1726883063.56751: done getting next task for host managed_node2 33277 1726883063.56753: ^ task is: TASK: meta (flush_handlers) 33277 1726883063.56756: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883063.56762: getting variables 33277 1726883063.56764: in VariableManager get_vars() 33277 1726883063.56820: Calling all_inventory to load vars for managed_node2 33277 1726883063.56825: Calling groups_inventory to load vars for managed_node2 33277 1726883063.56828: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.56843: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.56846: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.56850: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.57303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.57551: done with get_vars() 33277 1726883063.57562: done getting variables 33277 1726883063.57641: in VariableManager get_vars() 33277 1726883063.57698: Calling all_inventory to load vars for managed_node2 33277 1726883063.57701: Calling groups_inventory to load vars for managed_node2 33277 1726883063.57704: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.57709: Calling all_plugins_play to load vars 
for managed_node2 33277 1726883063.57711: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.57714: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.57906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.58143: done with get_vars() 33277 1726883063.58158: done queuing things up, now waiting for results queue to drain 33277 1726883063.58160: results queue empty 33277 1726883063.58161: checking for any_errors_fatal 33277 1726883063.58163: done checking for any_errors_fatal 33277 1726883063.58164: checking for max_fail_percentage 33277 1726883063.58165: done checking for max_fail_percentage 33277 1726883063.58166: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.58167: done checking to see if all hosts have failed 33277 1726883063.58168: getting the remaining hosts for this loop 33277 1726883063.58169: done getting the remaining hosts for this loop 33277 1726883063.58171: getting the next task for host managed_node2 33277 1726883063.58176: done getting next task for host managed_node2 33277 1726883063.58178: ^ task is: TASK: INIT: wireless tests 33277 1726883063.58179: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.58181: getting variables 33277 1726883063.58182: in VariableManager get_vars() 33277 1726883063.58199: Calling all_inventory to load vars for managed_node2 33277 1726883063.58201: Calling groups_inventory to load vars for managed_node2 33277 1726883063.58204: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.58209: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.58211: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.58214: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.58383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.58629: done with get_vars() 33277 1726883063.58638: done getting variables 33277 1726883063.58726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT: wireless tests] **************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:8 Friday 20 September 2024 21:44:23 -0400 (0:00:00.040) 0:00:06.775 ****** 33277 1726883063.58751: entering _queue_task() for managed_node2/debug 33277 1726883063.58753: Creating lock for debug 33277 1726883063.59082: worker is 1 (out of 1 available) 33277 1726883063.59210: exiting _queue_task() for managed_node2/debug 33277 1726883063.59220: done queuing things up, now waiting for results queue to drain 33277 1726883063.59226: waiting for pending results... 
33277 1726883063.59463: running TaskExecutor() for managed_node2/TASK: INIT: wireless tests 33277 1726883063.59528: in run() - task 0affc7ec-ae25-6628-6da4-00000000000b 33277 1726883063.59531: variable 'ansible_search_path' from source: unknown 33277 1726883063.59649: calling self._execute() 33277 1726883063.59670: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.59682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.59697: variable 'omit' from source: magic vars 33277 1726883063.60106: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.60125: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.60260: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.60272: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.60280: when evaluation is False, skipping this task 33277 1726883063.60287: _execute() done 33277 1726883063.60299: dumping result to json 33277 1726883063.60306: done dumping result, returning 33277 1726883063.60318: done running TaskExecutor() for managed_node2/TASK: INIT: wireless tests [0affc7ec-ae25-6628-6da4-00000000000b] 33277 1726883063.60333: sending task result for task 0affc7ec-ae25-6628-6da4-00000000000b 33277 1726883063.60550: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000000b 33277 1726883063.60553: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883063.60601: no more pending results, returning what we have 33277 1726883063.60606: results queue empty 33277 1726883063.60607: checking for any_errors_fatal 33277 1726883063.60609: done checking for any_errors_fatal 33277 1726883063.60610: checking for max_fail_percentage 33277 1726883063.60611: done checking for max_fail_percentage 33277 1726883063.60612: checking to see if all hosts 
have failed and the running result is not ok 33277 1726883063.60614: done checking to see if all hosts have failed 33277 1726883063.60614: getting the remaining hosts for this loop 33277 1726883063.60616: done getting the remaining hosts for this loop 33277 1726883063.60624: getting the next task for host managed_node2 33277 1726883063.60633: done getting next task for host managed_node2 33277 1726883063.60636: ^ task is: TASK: Include the task 'setup_mock_wifi.yml' 33277 1726883063.60638: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883063.60642: getting variables 33277 1726883063.60646: in VariableManager get_vars() 33277 1726883063.60698: Calling all_inventory to load vars for managed_node2 33277 1726883063.60701: Calling groups_inventory to load vars for managed_node2 33277 1726883063.60704: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.60718: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.60836: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.60843: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.61075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.61364: done with get_vars() 33277 1726883063.61374: done getting variables TASK [Include the task 'setup_mock_wifi.yml'] ********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11 Friday 20 September 2024 21:44:23 -0400 (0:00:00.027) 0:00:06.803 ****** 33277 1726883063.61470: entering _queue_task() for managed_node2/include_tasks 33277 1726883063.61841: worker is 1 
(out of 1 available) 33277 1726883063.61851: exiting _queue_task() for managed_node2/include_tasks 33277 1726883063.61862: done queuing things up, now waiting for results queue to drain 33277 1726883063.61863: waiting for pending results... 33277 1726883063.62053: running TaskExecutor() for managed_node2/TASK: Include the task 'setup_mock_wifi.yml' 33277 1726883063.62199: in run() - task 0affc7ec-ae25-6628-6da4-00000000000c 33277 1726883063.62203: variable 'ansible_search_path' from source: unknown 33277 1726883063.62220: calling self._execute() 33277 1726883063.62313: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.62326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.62340: variable 'omit' from source: magic vars 33277 1726883063.62802: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.62806: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.62927: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.62939: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.62946: when evaluation is False, skipping this task 33277 1726883063.62957: _execute() done 33277 1726883063.62964: dumping result to json 33277 1726883063.63020: done dumping result, returning 33277 1726883063.63025: done running TaskExecutor() for managed_node2/TASK: Include the task 'setup_mock_wifi.yml' [0affc7ec-ae25-6628-6da4-00000000000c] 33277 1726883063.63027: sending task result for task 0affc7ec-ae25-6628-6da4-00000000000c 33277 1726883063.63107: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000000c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883063.63263: no more pending results, returning what we have 33277 1726883063.63267: results queue 
empty 33277 1726883063.63268: checking for any_errors_fatal 33277 1726883063.63279: done checking for any_errors_fatal 33277 1726883063.63279: checking for max_fail_percentage 33277 1726883063.63281: done checking for max_fail_percentage 33277 1726883063.63281: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.63282: done checking to see if all hosts have failed 33277 1726883063.63283: getting the remaining hosts for this loop 33277 1726883063.63287: done getting the remaining hosts for this loop 33277 1726883063.63294: getting the next task for host managed_node2 33277 1726883063.63300: done getting next task for host managed_node2 33277 1726883063.63303: ^ task is: TASK: Copy client certs 33277 1726883063.63306: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.63309: getting variables 33277 1726883063.63310: in VariableManager get_vars() 33277 1726883063.63362: Calling all_inventory to load vars for managed_node2 33277 1726883063.63365: Calling groups_inventory to load vars for managed_node2 33277 1726883063.63368: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.63382: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.63385: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.63391: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.63715: WORKER PROCESS EXITING 33277 1726883063.63741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.64001: done with get_vars() 33277 1726883063.64011: done getting variables 33277 1726883063.64074: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Copy client certs] ******************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13 Friday 20 September 2024 21:44:23 -0400 (0:00:00.026) 0:00:06.829 ****** 33277 1726883063.64107: entering _queue_task() for managed_node2/copy 33277 1726883063.64459: worker is 1 (out of 1 available) 33277 1726883063.64472: exiting _queue_task() for managed_node2/copy 33277 1726883063.64485: done queuing things up, now waiting for results queue to drain 33277 1726883063.64489: waiting for pending results... 
33277 1726883063.64981: running TaskExecutor() for managed_node2/TASK: Copy client certs 33277 1726883063.65166: in run() - task 0affc7ec-ae25-6628-6da4-00000000000d 33277 1726883063.65194: variable 'ansible_search_path' from source: unknown 33277 1726883063.65766: Loaded config def from plugin (lookup/items) 33277 1726883063.65770: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 33277 1726883063.65858: variable 'omit' from source: magic vars 33277 1726883063.66204: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.66208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.66211: variable 'omit' from source: magic vars 33277 1726883063.67202: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.67225: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.67589: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.67593: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.67596: when evaluation is False, skipping this task 33277 1726883063.67599: variable 'item' from source: unknown 33277 1726883063.67770: variable 'item' from source: unknown skipping: [managed_node2] => (item=client.key) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.key", "skip_reason": "Conditional result was False" } 33277 1726883063.68163: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.68167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.68169: variable 'omit' from source: magic vars 33277 1726883063.68599: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.68708: Evaluated conditional (ansible_distribution_major_version != '6'): True 
33277 1726883063.68849: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.68860: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.68866: when evaluation is False, skipping this task 33277 1726883063.68898: variable 'item' from source: unknown 33277 1726883063.69081: variable 'item' from source: unknown skipping: [managed_node2] => (item=client.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.pem", "skip_reason": "Conditional result was False" } 33277 1726883063.69527: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.69531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.69534: variable 'omit' from source: magic vars 33277 1726883063.69625: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.69870: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.69948: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.69989: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.70197: when evaluation is False, skipping this task 33277 1726883063.70201: variable 'item' from source: unknown 33277 1726883063.70204: variable 'item' from source: unknown skipping: [managed_node2] => (item=cacert.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "cacert.pem", "skip_reason": "Conditional result was False" } 33277 1726883063.70525: dumping result to json 33277 1726883063.70531: done dumping result, returning 33277 1726883063.70535: done running TaskExecutor() for managed_node2/TASK: Copy client certs [0affc7ec-ae25-6628-6da4-00000000000d] 33277 1726883063.70537: sending task result for task 0affc7ec-ae25-6628-6da4-00000000000d 33277 
1726883063.70584: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000000d 33277 1726883063.70589: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 33277 1726883063.70672: no more pending results, returning what we have 33277 1726883063.70675: results queue empty 33277 1726883063.70676: checking for any_errors_fatal 33277 1726883063.70681: done checking for any_errors_fatal 33277 1726883063.70682: checking for max_fail_percentage 33277 1726883063.70684: done checking for max_fail_percentage 33277 1726883063.70685: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.70688: done checking to see if all hosts have failed 33277 1726883063.70689: getting the remaining hosts for this loop 33277 1726883063.70690: done getting the remaining hosts for this loop 33277 1726883063.70695: getting the next task for host managed_node2 33277 1726883063.70702: done getting next task for host managed_node2 33277 1726883063.70705: ^ task is: TASK: TEST: wireless connection with WPA-PSK 33277 1726883063.70708: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
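[Editor's note: the "Copy client certs" records above show Ansible's per-item skip behavior — each loop item is evaluated against the `when` clause and individually reported as skipped before the aggregate "All items skipped" result is emitted. A minimal sketch of what the task at tests_wireless.yml:13 likely looks like, reconstructed only from the items and conditions visible in this log; the `src`/`dest` paths are assumptions, and the `!= '6'` guard is inferred from the separate conditional evaluation logged for every task:]

```yaml
# Hypothetical reconstruction, not the actual playbook source.
# Items and the false condition are taken from the log; paths are guesses.
- name: Copy client certs
  copy:
    src: "{{ item }}"           # assumed: files shipped next to the playbook
    dest: /etc/pki/tls/certs/   # assumed destination
    mode: "0600"
  loop:
    - client.key
    - client.pem
    - cacert.pem
  when:
    - ansible_distribution_major_version != '6'   # evaluated True in the log
    - ansible_distribution_major_version == '7'   # evaluated False -> each item skipped
```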
False 33277 1726883063.70711: getting variables 33277 1726883063.70713: in VariableManager get_vars() 33277 1726883063.70776: Calling all_inventory to load vars for managed_node2 33277 1726883063.70779: Calling groups_inventory to load vars for managed_node2 33277 1726883063.70782: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.70800: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.70803: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.70807: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.71604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.72265: done with get_vars() 33277 1726883063.72277: done getting variables 33277 1726883063.72347: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with WPA-PSK] ********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24 Friday 20 September 2024 21:44:23 -0400 (0:00:00.083) 0:00:06.912 ****** 33277 1726883063.72434: entering _queue_task() for managed_node2/debug 33277 1726883063.73270: worker is 1 (out of 1 available) 33277 1726883063.73283: exiting _queue_task() for managed_node2/debug 33277 1726883063.73297: done queuing things up, now waiting for results queue to drain 33277 1726883063.73299: waiting for pending results... 
33277 1726883063.73846: running TaskExecutor() for managed_node2/TASK: TEST: wireless connection with WPA-PSK 33277 1726883063.73853: in run() - task 0affc7ec-ae25-6628-6da4-00000000000f 33277 1726883063.73856: variable 'ansible_search_path' from source: unknown 33277 1726883063.74058: calling self._execute() 33277 1726883063.74327: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.74332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.74334: variable 'omit' from source: magic vars 33277 1726883063.75155: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.75175: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.75497: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.75509: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.75516: when evaluation is False, skipping this task 33277 1726883063.75526: _execute() done 33277 1726883063.75534: dumping result to json 33277 1726883063.75543: done dumping result, returning 33277 1726883063.75554: done running TaskExecutor() for managed_node2/TASK: TEST: wireless connection with WPA-PSK [0affc7ec-ae25-6628-6da4-00000000000f] 33277 1726883063.75564: sending task result for task 0affc7ec-ae25-6628-6da4-00000000000f 33277 1726883063.75915: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000000f 33277 1726883063.75918: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883063.75973: no more pending results, returning what we have 33277 1726883063.75977: results queue empty 33277 1726883063.75978: checking for any_errors_fatal 33277 1726883063.75990: done checking for any_errors_fatal 33277 1726883063.75991: checking for max_fail_percentage 33277 1726883063.75992: done checking for max_fail_percentage 33277 
1726883063.75993: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.75994: done checking to see if all hosts have failed 33277 1726883063.75995: getting the remaining hosts for this loop 33277 1726883063.75996: done getting the remaining hosts for this loop 33277 1726883063.76001: getting the next task for host managed_node2 33277 1726883063.76008: done getting next task for host managed_node2 33277 1726883063.76014: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33277 1726883063.76018: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.76040: getting variables 33277 1726883063.76043: in VariableManager get_vars() 33277 1726883063.76100: Calling all_inventory to load vars for managed_node2 33277 1726883063.76104: Calling groups_inventory to load vars for managed_node2 33277 1726883063.76106: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.76120: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.76352: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.76358: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.76930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.77875: done with get_vars() 33277 1726883063.77889: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:23 -0400 (0:00:00.057) 0:00:06.969 ****** 33277 1726883063.78148: entering _queue_task() for managed_node2/include_tasks 33277 1726883063.78869: worker is 1 (out of 1 available) 33277 1726883063.78884: exiting _queue_task() for managed_node2/include_tasks 33277 1726883063.78900: done queuing things up, now waiting for results queue to drain 33277 1726883063.78902: waiting for pending results... 
33277 1726883063.79619: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33277 1726883063.79746: in run() - task 0affc7ec-ae25-6628-6da4-000000000017 33277 1726883063.79770: variable 'ansible_search_path' from source: unknown 33277 1726883063.79777: variable 'ansible_search_path' from source: unknown 33277 1726883063.80028: calling self._execute() 33277 1726883063.80032: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.80265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.80268: variable 'omit' from source: magic vars 33277 1726883063.81141: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.81145: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.81310: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.81323: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.81331: when evaluation is False, skipping this task 33277 1726883063.81338: _execute() done 33277 1726883063.81344: dumping result to json 33277 1726883063.81356: done dumping result, returning 33277 1726883063.81369: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-6628-6da4-000000000017] 33277 1726883063.81428: sending task result for task 0affc7ec-ae25-6628-6da4-000000000017 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883063.81595: no more pending results, returning what we have 33277 1726883063.81600: results queue empty 33277 1726883063.81601: checking for any_errors_fatal 33277 1726883063.81611: done checking for any_errors_fatal 33277 1726883063.81611: checking for max_fail_percentage 33277 
1726883063.81613: done checking for max_fail_percentage 33277 1726883063.81614: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.81615: done checking to see if all hosts have failed 33277 1726883063.81616: getting the remaining hosts for this loop 33277 1726883063.81617: done getting the remaining hosts for this loop 33277 1726883063.81624: getting the next task for host managed_node2 33277 1726883063.81631: done getting next task for host managed_node2 33277 1726883063.81636: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33277 1726883063.81639: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.81664: getting variables 33277 1726883063.81666: in VariableManager get_vars() 33277 1726883063.81982: Calling all_inventory to load vars for managed_node2 33277 1726883063.81989: Calling groups_inventory to load vars for managed_node2 33277 1726883063.81992: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.82006: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.82009: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.82012: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.82574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.83132: done with get_vars() 33277 1726883063.83144: done getting variables 33277 1726883063.83160: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000017 33277 1726883063.83163: WORKER PROCESS EXITING 33277 1726883063.83372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 21:44:23 -0400 (0:00:00.052) 0:00:07.022 ******
33277 1726883063.83410: entering _queue_task() for managed_node2/debug 33277 1726883063.84006: worker is 1 (out of 1 available) 33277 1726883063.84021: exiting _queue_task() for managed_node2/debug 33277 1726883063.84267: done queuing things up, now waiting for results queue to drain 33277 1726883063.84269: waiting for pending results...
33277 1726883063.84613: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 33277 1726883063.84871: in run() - task 0affc7ec-ae25-6628-6da4-000000000018 33277 1726883063.84935: variable 'ansible_search_path' from source: unknown 33277 1726883063.85132: variable 'ansible_search_path' from source: unknown 33277 1726883063.85136: calling self._execute() 33277 1726883063.85265: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.85278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.85295: variable 'omit' from source: magic vars 33277 1726883063.86111: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.86350: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.86666: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.86670: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.86672: when evaluation is False, skipping this task 33277 1726883063.86675: _execute() done 33277 1726883063.86678: dumping result to json 33277 1726883063.86680: done dumping result, returning 33277 1726883063.86683: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-6628-6da4-000000000018] 33277 1726883063.86688: sending task result for task 0affc7ec-ae25-6628-6da4-000000000018 33277 1726883063.86764: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000018 skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883063.86828: no more pending results, returning what we have 33277 1726883063.86833: results queue empty 33277 1726883063.86834: checking for any_errors_fatal 33277 1726883063.86844: done checking for any_errors_fatal 33277 1726883063.86844: checking for max_fail_percentage 33277 
1726883063.86846: done checking for max_fail_percentage 33277 1726883063.86847: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.86849: done checking to see if all hosts have failed 33277 1726883063.86849: getting the remaining hosts for this loop 33277 1726883063.86851: done getting the remaining hosts for this loop 33277 1726883063.86855: getting the next task for host managed_node2 33277 1726883063.86862: done getting next task for host managed_node2 33277 1726883063.86867: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33277 1726883063.86870: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.86893: getting variables 33277 1726883063.86895: in VariableManager get_vars() 33277 1726883063.86953: Calling all_inventory to load vars for managed_node2 33277 1726883063.86956: Calling groups_inventory to load vars for managed_node2 33277 1726883063.86958: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.86972: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.86974: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.86977: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.87666: WORKER PROCESS EXITING 33277 1726883063.87915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.88390: done with get_vars() 33277 1726883063.88402: done getting variables 33277 1726883063.88627: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 21:44:23 -0400 (0:00:00.053) 0:00:07.076 ******
33277 1726883063.88777: entering _queue_task() for managed_node2/fail 33277 1726883063.88780: Creating lock for fail 33277 1726883063.89502: worker is 1 (out of 1 available) 33277 1726883063.89519: exiting _queue_task() for managed_node2/fail 33277 1726883063.89532: done queuing things up, now waiting for results queue to drain 33277 1726883063.89534: waiting for pending results...
33277 1726883063.89996: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33277 1726883063.90156: in run() - task 0affc7ec-ae25-6628-6da4-000000000019 33277 1726883063.90187: variable 'ansible_search_path' from source: unknown 33277 1726883063.90196: variable 'ansible_search_path' from source: unknown 33277 1726883063.90242: calling self._execute() 33277 1726883063.90347: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.90374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.90381: variable 'omit' from source: magic vars 33277 1726883063.90923: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.90932: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.91009: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.91020: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.91032: when evaluation is False, skipping this task 33277 1726883063.91044: _execute() done 33277 1726883063.91055: dumping result to json 33277 1726883063.91062: done dumping result, returning 33277 1726883063.91075: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-6628-6da4-000000000019] 33277 1726883063.91088: sending task result for task 0affc7ec-ae25-6628-6da4-000000000019 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883063.91380: no more pending results, returning what we have 33277 1726883063.91388: results queue empty 33277 1726883063.91389: 
checking for any_errors_fatal 33277 1726883063.91395: done checking for any_errors_fatal 33277 1726883063.91395: checking for max_fail_percentage 33277 1726883063.91397: done checking for max_fail_percentage 33277 1726883063.91397: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.91398: done checking to see if all hosts have failed 33277 1726883063.91399: getting the remaining hosts for this loop 33277 1726883063.91400: done getting the remaining hosts for this loop 33277 1726883063.91406: getting the next task for host managed_node2 33277 1726883063.91412: done getting next task for host managed_node2 33277 1726883063.91416: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33277 1726883063.91419: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.91438: getting variables 33277 1726883063.91440: in VariableManager get_vars() 33277 1726883063.91496: Calling all_inventory to load vars for managed_node2 33277 1726883063.91500: Calling groups_inventory to load vars for managed_node2 33277 1726883063.91502: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.91515: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.91518: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.91521: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.92251: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000019 33277 1726883063.92255: WORKER PROCESS EXITING 33277 1726883063.92417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.92810: done with get_vars() 33277 1726883063.92824: done getting variables 33277 1726883063.93008: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:44:23 -0400 (0:00:00.043) 0:00:07.119 ******
33277 1726883063.93149: entering _queue_task() for managed_node2/fail 33277 1726883063.93802: worker is 1 (out of 1 available) 33277 1726883063.93815: exiting _queue_task() for managed_node2/fail 33277 1726883063.94084: done queuing things up, now waiting for results queue to drain 33277 1726883063.94089: waiting for pending results...
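Editor's note on the pattern repeated throughout this trace: each task logs `Evaluated conditional (ansible_distribution_major_version != '6'): True`, then `Evaluated conditional (ansible_distribution_major_version == '7'): False`, and is skipped because the second `when` entry is false on this host. The following YAML is a hypothetical, minimal sketch of such a guarded task — it is NOT the actual content of the role's `tasks/main.yml`; the task name and message are illustrative only.

```yaml
# Illustrative only: a task guarded the way the skipped tasks above are.
# Both list entries must be true for the task to run; here the second
# entry is false on this host, so ansible reports "skipping" with
# false_condition set to "ansible_distribution_major_version == '7'".
- name: Example task gated on the distribution major version (hypothetical)
  ansible.builtin.debug:
    msg: "This would only run on an EL7-family host"
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```

A `when` list is implicitly AND-ed, which is why the trace shows both conditionals evaluated in order and the task skipped as soon as the combined result is false.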
33277 1726883063.94434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33277 1726883063.94741: in run() - task 0affc7ec-ae25-6628-6da4-00000000001a 33277 1726883063.94850: variable 'ansible_search_path' from source: unknown 33277 1726883063.94855: variable 'ansible_search_path' from source: unknown 33277 1726883063.94900: calling self._execute() 33277 1726883063.95151: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883063.95165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883063.95185: variable 'omit' from source: magic vars 33277 1726883063.96159: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.96180: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883063.96820: variable 'ansible_distribution_major_version' from source: facts 33277 1726883063.96826: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883063.96828: when evaluation is False, skipping this task 33277 1726883063.96831: _execute() done 33277 1726883063.96834: dumping result to json 33277 1726883063.96836: done dumping result, returning 33277 1726883063.96840: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-6628-6da4-00000000001a] 33277 1726883063.96842: sending task result for task 0affc7ec-ae25-6628-6da4-00000000001a skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883063.97182: no more pending results, returning what we have 33277 1726883063.97188: results queue empty 33277 1726883063.97190: checking for any_errors_fatal 33277 
1726883063.97198: done checking for any_errors_fatal 33277 1726883063.97198: checking for max_fail_percentage 33277 1726883063.97200: done checking for max_fail_percentage 33277 1726883063.97201: checking to see if all hosts have failed and the running result is not ok 33277 1726883063.97202: done checking to see if all hosts have failed 33277 1726883063.97202: getting the remaining hosts for this loop 33277 1726883063.97204: done getting the remaining hosts for this loop 33277 1726883063.97208: getting the next task for host managed_node2 33277 1726883063.97215: done getting next task for host managed_node2 33277 1726883063.97220: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33277 1726883063.97225: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883063.97243: getting variables 33277 1726883063.97245: in VariableManager get_vars() 33277 1726883063.97304: Calling all_inventory to load vars for managed_node2 33277 1726883063.97307: Calling groups_inventory to load vars for managed_node2 33277 1726883063.97309: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883063.97631: Calling all_plugins_play to load vars for managed_node2 33277 1726883063.97636: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883063.97641: Calling groups_plugins_play to load vars for managed_node2 33277 1726883063.98177: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000001a 33277 1726883063.98181: WORKER PROCESS EXITING 33277 1726883063.98210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883063.98705: done with get_vars() 33277 1726883063.98832: done getting variables 33277 1726883063.98900: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:44:23 -0400 (0:00:00.058) 0:00:07.178 ******
33277 1726883063.99040: entering _queue_task() for managed_node2/fail 33277 1726883063.99673: worker is 1 (out of 1 available) 33277 1726883063.99689: exiting _queue_task() for managed_node2/fail 33277 1726883063.99928: done queuing things up, now waiting for results queue to drain 33277 1726883063.99930: waiting for pending results...
33277 1726883064.00312: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33277 1726883064.00601: in run() - task 0affc7ec-ae25-6628-6da4-00000000001b 33277 1726883064.00626: variable 'ansible_search_path' from source: unknown 33277 1726883064.00635: variable 'ansible_search_path' from source: unknown 33277 1726883064.00709: calling self._execute() 33277 1726883064.00932: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.01004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.01023: variable 'omit' from source: magic vars 33277 1726883064.01852: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.01871: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.02180: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.02265: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.02269: when evaluation is False, skipping this task 33277 1726883064.02272: _execute() done 33277 1726883064.02275: dumping result to json 33277 1726883064.02277: done dumping result, returning 33277 1726883064.02280: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-6628-6da4-00000000001b] 33277 1726883064.02282: sending task result for task 0affc7ec-ae25-6628-6da4-00000000001b 33277 1726883064.02363: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000001b 33277 1726883064.02366: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.02577: no more pending 
results, returning what we have 33277 1726883064.02582: results queue empty 33277 1726883064.02584: checking for any_errors_fatal 33277 1726883064.02593: done checking for any_errors_fatal 33277 1726883064.02594: checking for max_fail_percentage 33277 1726883064.02596: done checking for max_fail_percentage 33277 1726883064.02597: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.02598: done checking to see if all hosts have failed 33277 1726883064.02599: getting the remaining hosts for this loop 33277 1726883064.02600: done getting the remaining hosts for this loop 33277 1726883064.02605: getting the next task for host managed_node2 33277 1726883064.02612: done getting next task for host managed_node2 33277 1726883064.02617: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33277 1726883064.02620: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.02640: getting variables 33277 1726883064.02642: in VariableManager get_vars() 33277 1726883064.02703: Calling all_inventory to load vars for managed_node2 33277 1726883064.02706: Calling groups_inventory to load vars for managed_node2 33277 1726883064.02708: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.02998: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.03003: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.03008: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.03214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.03909: done with get_vars() 33277 1726883064.03924: done getting variables 33277 1726883064.04231: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:44:24 -0400 (0:00:00.052) 0:00:07.231 ******
33277 1726883064.04266: entering _queue_task() for managed_node2/dnf 33277 1726883064.05005: worker is 1 (out of 1 available) 33277 1726883064.05018: exiting _queue_task() for managed_node2/dnf 33277 1726883064.05232: done queuing things up, now waiting for results queue to drain 33277 1726883064.05235: waiting for pending results...
33277 1726883064.05510: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33277 1726883064.05941: in run() - task 0affc7ec-ae25-6628-6da4-00000000001c 33277 1726883064.05945: variable 'ansible_search_path' from source: unknown 33277 1726883064.05948: variable 'ansible_search_path' from source: unknown 33277 1726883064.06128: calling self._execute() 33277 1726883064.06243: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.06256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.06279: variable 'omit' from source: magic vars 33277 1726883064.07306: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.07374: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.07620: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.07670: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.07689: when evaluation is False, skipping this task 33277 1726883064.07715: _execute() done 33277 1726883064.07724: dumping result to json 33277 1726883064.07732: done dumping result, returning 33277 1726883064.07759: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-00000000001c] 33277 1726883064.07801: sending task result for task 0affc7ec-ae25-6628-6da4-00000000001c 33277 1726883064.08094: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000001c 33277 1726883064.08098: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 33277 1726883064.08169: no more pending results, returning what we have 33277 1726883064.08174: results queue empty 33277 1726883064.08175: checking for any_errors_fatal 33277 1726883064.08184: done checking for any_errors_fatal 33277 1726883064.08185: checking for max_fail_percentage 33277 1726883064.08189: done checking for max_fail_percentage 33277 1726883064.08190: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.08191: done checking to see if all hosts have failed 33277 1726883064.08191: getting the remaining hosts for this loop 33277 1726883064.08193: done getting the remaining hosts for this loop 33277 1726883064.08197: getting the next task for host managed_node2 33277 1726883064.08203: done getting next task for host managed_node2 33277 1726883064.08207: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33277 1726883064.08210: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.08228: getting variables 33277 1726883064.08230: in VariableManager get_vars() 33277 1726883064.08650: Calling all_inventory to load vars for managed_node2 33277 1726883064.08654: Calling groups_inventory to load vars for managed_node2 33277 1726883064.08657: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.08671: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.08674: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.08677: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.09077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.09636: done with get_vars() 33277 1726883064.09650: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33277 1726883064.09941: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:44:24 -0400 (0:00:00.057) 0:00:07.288 ******
33277 1726883064.09975: entering _queue_task() for managed_node2/yum 33277 1726883064.09977: Creating lock for yum 33277 1726883064.10418: worker is 1 (out of 1 available) 33277 1726883064.10932: exiting _queue_task() for managed_node2/yum 33277 1726883064.10943: done queuing things up, now waiting for results queue to drain 33277 1726883064.10945: waiting for pending results...
33277 1726883064.11189: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33277 1726883064.11353: in run() - task 0affc7ec-ae25-6628-6da4-00000000001d 33277 1726883064.11550: variable 'ansible_search_path' from source: unknown 33277 1726883064.11554: variable 'ansible_search_path' from source: unknown 33277 1726883064.11557: calling self._execute() 33277 1726883064.11703: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.11716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.11779: variable 'omit' from source: magic vars 33277 1726883064.12562: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.12581: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.13148: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.13376: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.13380: when evaluation is False, skipping this task 33277 1726883064.13383: _execute() done 33277 1726883064.13388: dumping result to json 33277 1726883064.13391: done dumping result, returning 33277 1726883064.13395: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-00000000001d] 33277 1726883064.13398: sending task result for task 0affc7ec-ae25-6628-6da4-00000000001d skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.13790: no more pending results, returning what we have 33277 1726883064.13794: results queue empty 33277 
1726883064.13795: checking for any_errors_fatal 33277 1726883064.13803: done checking for any_errors_fatal 33277 1726883064.13804: checking for max_fail_percentage 33277 1726883064.13806: done checking for max_fail_percentage 33277 1726883064.13807: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.13808: done checking to see if all hosts have failed 33277 1726883064.13809: getting the remaining hosts for this loop 33277 1726883064.13810: done getting the remaining hosts for this loop 33277 1726883064.13815: getting the next task for host managed_node2 33277 1726883064.13824: done getting next task for host managed_node2 33277 1726883064.13828: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33277 1726883064.13831: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.13849: getting variables 33277 1726883064.13851: in VariableManager get_vars() 33277 1726883064.13910: Calling all_inventory to load vars for managed_node2 33277 1726883064.13914: Calling groups_inventory to load vars for managed_node2 33277 1726883064.13916: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.14133: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.14137: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.14142: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.14829: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000001d 33277 1726883064.14832: WORKER PROCESS EXITING 33277 1726883064.14858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.15347: done with get_vars() 33277 1726883064.15359: done getting variables 33277 1726883064.15628: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:24 -0400 (0:00:00.056) 0:00:07.345 ****** 33277 1726883064.15666: entering _queue_task() for managed_node2/fail 33277 1726883064.16380: worker is 1 (out of 1 available) 33277 1726883064.16398: exiting _queue_task() for managed_node2/fail 33277 1726883064.16410: done queuing things up, now waiting for results queue to drain 33277 1726883064.16412: waiting for pending results... 
33277 1726883064.16904: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33277 1726883064.17202: in run() - task 0affc7ec-ae25-6628-6da4-00000000001e 33277 1726883064.17283: variable 'ansible_search_path' from source: unknown 33277 1726883064.17296: variable 'ansible_search_path' from source: unknown 33277 1726883064.17342: calling self._execute() 33277 1726883064.17617: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.17635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.17652: variable 'omit' from source: magic vars 33277 1726883064.18439: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.18503: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.18922: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.18927: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.18930: when evaluation is False, skipping this task 33277 1726883064.18932: _execute() done 33277 1726883064.18935: dumping result to json 33277 1726883064.18937: done dumping result, returning 33277 1726883064.18940: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-00000000001e] 33277 1726883064.18943: sending task result for task 0affc7ec-ae25-6628-6da4-00000000001e skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.19083: no more pending results, returning what we have 33277 1726883064.19090: results queue empty 33277 1726883064.19092: checking for any_errors_fatal 33277 1726883064.19104: done checking for 
any_errors_fatal 33277 1726883064.19105: checking for max_fail_percentage 33277 1726883064.19107: done checking for max_fail_percentage 33277 1726883064.19107: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.19108: done checking to see if all hosts have failed 33277 1726883064.19109: getting the remaining hosts for this loop 33277 1726883064.19110: done getting the remaining hosts for this loop 33277 1726883064.19115: getting the next task for host managed_node2 33277 1726883064.19124: done getting next task for host managed_node2 33277 1726883064.19129: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 33277 1726883064.19132: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.19150: getting variables 33277 1726883064.19152: in VariableManager get_vars() 33277 1726883064.19210: Calling all_inventory to load vars for managed_node2 33277 1726883064.19214: Calling groups_inventory to load vars for managed_node2 33277 1726883064.19217: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.19534: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.19538: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.19543: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.20159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.20620: done with get_vars() 33277 1726883064.20633: done getting variables 33277 1726883064.20670: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000001e 33277 1726883064.20674: WORKER PROCESS EXITING 33277 1726883064.20715: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:24 -0400 (0:00:00.052) 0:00:07.397 ****** 33277 1726883064.20956: entering _queue_task() for managed_node2/package 33277 1726883064.21475: worker is 1 (out of 1 available) 33277 1726883064.21492: exiting _queue_task() for managed_node2/package 33277 1726883064.21505: done queuing things up, now waiting for results queue to drain 33277 1726883064.21507: waiting for pending results... 
33277 1726883064.22477: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 33277 1726883064.22607: in run() - task 0affc7ec-ae25-6628-6da4-00000000001f 33277 1726883064.22648: variable 'ansible_search_path' from source: unknown 33277 1726883064.23010: variable 'ansible_search_path' from source: unknown 33277 1726883064.23014: calling self._execute() 33277 1726883064.23255: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.23268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.23288: variable 'omit' from source: magic vars 33277 1726883064.24196: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.24213: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.24525: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.24538: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.24545: when evaluation is False, skipping this task 33277 1726883064.24552: _execute() done 33277 1726883064.24560: dumping result to json 33277 1726883064.24567: done dumping result, returning 33277 1726883064.24581: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-6628-6da4-00000000001f] 33277 1726883064.24810: sending task result for task 0affc7ec-ae25-6628-6da4-00000000001f 33277 1726883064.24900: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000001f 33277 1726883064.24904: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.24976: no more pending results, returning what we have 33277 1726883064.24982: results queue empty 33277 1726883064.24983: checking for any_errors_fatal 33277 1726883064.24994: done 
checking for any_errors_fatal 33277 1726883064.24995: checking for max_fail_percentage 33277 1726883064.24997: done checking for max_fail_percentage 33277 1726883064.24998: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.24999: done checking to see if all hosts have failed 33277 1726883064.24999: getting the remaining hosts for this loop 33277 1726883064.25001: done getting the remaining hosts for this loop 33277 1726883064.25006: getting the next task for host managed_node2 33277 1726883064.25013: done getting next task for host managed_node2 33277 1726883064.25017: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33277 1726883064.25020: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.25039: getting variables 33277 1726883064.25041: in VariableManager get_vars() 33277 1726883064.25094: Calling all_inventory to load vars for managed_node2 33277 1726883064.25097: Calling groups_inventory to load vars for managed_node2 33277 1726883064.25099: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.25113: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.25115: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.25117: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.25971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.26630: done with get_vars() 33277 1726883064.26642: done getting variables 33277 1726883064.26708: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:24 -0400 (0:00:00.061) 0:00:07.459 ****** 33277 1726883064.27148: entering _queue_task() for managed_node2/package 33277 1726883064.27895: worker is 1 (out of 1 available) 33277 1726883064.27909: exiting _queue_task() for managed_node2/package 33277 1726883064.28325: done queuing things up, now waiting for results queue to drain 33277 1726883064.28328: waiting for pending results... 
33277 1726883064.28638: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33277 1726883064.28735: in run() - task 0affc7ec-ae25-6628-6da4-000000000020 33277 1726883064.28739: variable 'ansible_search_path' from source: unknown 33277 1726883064.28742: variable 'ansible_search_path' from source: unknown 33277 1726883064.28798: calling self._execute() 33277 1726883064.28917: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.28935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.28988: variable 'omit' from source: magic vars 33277 1726883064.29453: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.29496: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.29629: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.29715: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.29719: when evaluation is False, skipping this task 33277 1726883064.29724: _execute() done 33277 1726883064.29727: dumping result to json 33277 1726883064.29729: done dumping result, returning 33277 1726883064.29737: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-6628-6da4-000000000020] 33277 1726883064.29740: sending task result for task 0affc7ec-ae25-6628-6da4-000000000020 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.29989: no more pending results, returning what we have 33277 1726883064.29994: results queue empty 33277 1726883064.29995: checking for any_errors_fatal 33277 1726883064.30001: done checking for any_errors_fatal 33277 
1726883064.30001: checking for max_fail_percentage 33277 1726883064.30003: done checking for max_fail_percentage 33277 1726883064.30004: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.30004: done checking to see if all hosts have failed 33277 1726883064.30005: getting the remaining hosts for this loop 33277 1726883064.30007: done getting the remaining hosts for this loop 33277 1726883064.30011: getting the next task for host managed_node2 33277 1726883064.30017: done getting next task for host managed_node2 33277 1726883064.30020: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33277 1726883064.30025: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.30040: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000020 33277 1726883064.30044: WORKER PROCESS EXITING 33277 1726883064.30129: getting variables 33277 1726883064.30131: in VariableManager get_vars() 33277 1726883064.30176: Calling all_inventory to load vars for managed_node2 33277 1726883064.30179: Calling groups_inventory to load vars for managed_node2 33277 1726883064.30181: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.30193: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.30196: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.30200: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.30603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.30903: done with get_vars() 33277 1726883064.30915: done getting variables 33277 1726883064.31019: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:24 -0400 (0:00:00.039) 0:00:07.498 ****** 33277 1726883064.31058: entering _queue_task() for managed_node2/package 33277 1726883064.31369: worker is 1 (out of 1 available) 33277 1726883064.31384: exiting _queue_task() for managed_node2/package 33277 1726883064.31400: done queuing things up, now waiting for results queue to drain 33277 1726883064.31402: waiting for pending results... 
33277 1726883064.31699: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33277 1726883064.32129: in run() - task 0affc7ec-ae25-6628-6da4-000000000021 33277 1726883064.32134: variable 'ansible_search_path' from source: unknown 33277 1726883064.32137: variable 'ansible_search_path' from source: unknown 33277 1726883064.32149: calling self._execute() 33277 1726883064.32483: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.32490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.32493: variable 'omit' from source: magic vars 33277 1726883064.33294: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.33312: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.33475: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.33491: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.33500: when evaluation is False, skipping this task 33277 1726883064.33507: _execute() done 33277 1726883064.33514: dumping result to json 33277 1726883064.33525: done dumping result, returning 33277 1726883064.33578: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-6628-6da4-000000000021] 33277 1726883064.33593: sending task result for task 0affc7ec-ae25-6628-6da4-000000000021 33277 1726883064.33754: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000021 33277 1726883064.33757: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.33831: no more pending results, returning what we have 33277 1726883064.33836: results queue 
empty 33277 1726883064.33837: checking for any_errors_fatal 33277 1726883064.33846: done checking for any_errors_fatal 33277 1726883064.33847: checking for max_fail_percentage 33277 1726883064.33848: done checking for max_fail_percentage 33277 1726883064.33849: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.33849: done checking to see if all hosts have failed 33277 1726883064.33850: getting the remaining hosts for this loop 33277 1726883064.33851: done getting the remaining hosts for this loop 33277 1726883064.33856: getting the next task for host managed_node2 33277 1726883064.33863: done getting next task for host managed_node2 33277 1726883064.33866: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33277 1726883064.33869: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.33888: getting variables 33277 1726883064.33890: in VariableManager get_vars() 33277 1726883064.33947: Calling all_inventory to load vars for managed_node2 33277 1726883064.33951: Calling groups_inventory to load vars for managed_node2 33277 1726883064.33953: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.33967: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.33970: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.33973: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.34449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.34669: done with get_vars() 33277 1726883064.34678: done getting variables 33277 1726883064.34783: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:24 -0400 (0:00:00.037) 0:00:07.536 ****** 33277 1726883064.34818: entering _queue_task() for managed_node2/service 33277 1726883064.34820: Creating lock for service 33277 1726883064.35100: worker is 1 (out of 1 available) 33277 1726883064.35113: exiting _queue_task() for managed_node2/service 33277 1726883064.35127: done queuing things up, now waiting for results queue to drain 33277 1726883064.35129: waiting for pending results... 
33277 1726883064.35544: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33277 1726883064.35549: in run() - task 0affc7ec-ae25-6628-6da4-000000000022 33277 1726883064.35567: variable 'ansible_search_path' from source: unknown 33277 1726883064.35574: variable 'ansible_search_path' from source: unknown 33277 1726883064.35618: calling self._execute() 33277 1726883064.35709: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.35721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.35737: variable 'omit' from source: magic vars 33277 1726883064.36139: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.36155: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.36281: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.36301: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.36309: when evaluation is False, skipping this task 33277 1726883064.36316: _execute() done 33277 1726883064.36326: dumping result to json 33277 1726883064.36334: done dumping result, returning 33277 1726883064.36346: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-000000000022] 33277 1726883064.36355: sending task result for task 0affc7ec-ae25-6628-6da4-000000000022 33277 1726883064.36628: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000022 33277 1726883064.36632: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.36672: no more pending results, returning what we have 33277 1726883064.36676: results queue empty 
33277 1726883064.36677: checking for any_errors_fatal 33277 1726883064.36682: done checking for any_errors_fatal 33277 1726883064.36683: checking for max_fail_percentage 33277 1726883064.36684: done checking for max_fail_percentage 33277 1726883064.36687: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.36688: done checking to see if all hosts have failed 33277 1726883064.36689: getting the remaining hosts for this loop 33277 1726883064.36690: done getting the remaining hosts for this loop 33277 1726883064.36694: getting the next task for host managed_node2 33277 1726883064.36700: done getting next task for host managed_node2 33277 1726883064.36703: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33277 1726883064.36707: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.36721: getting variables 33277 1726883064.36724: in VariableManager get_vars() 33277 1726883064.36768: Calling all_inventory to load vars for managed_node2 33277 1726883064.36771: Calling groups_inventory to load vars for managed_node2 33277 1726883064.36773: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.36783: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.36788: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.36792: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.37070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.37391: done with get_vars() 33277 1726883064.37573: done getting variables 33277 1726883064.37636: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:24 -0400 (0:00:00.028) 0:00:07.565 ****** 33277 1726883064.37667: entering _queue_task() for managed_node2/service 33277 1726883064.38333: worker is 1 (out of 1 available) 33277 1726883064.38346: exiting _queue_task() for managed_node2/service 33277 1726883064.38357: done queuing things up, now waiting for results queue to drain 33277 1726883064.38358: waiting for pending results... 
33277 1726883064.38738: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33277 1726883064.39230: in run() - task 0affc7ec-ae25-6628-6da4-000000000023 33277 1726883064.39235: variable 'ansible_search_path' from source: unknown 33277 1726883064.39238: variable 'ansible_search_path' from source: unknown 33277 1726883064.39241: calling self._execute() 33277 1726883064.39243: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.39349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.39366: variable 'omit' from source: magic vars 33277 1726883064.40200: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.40218: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.40393: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.40467: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.40476: when evaluation is False, skipping this task 33277 1726883064.40483: _execute() done 33277 1726883064.40498: dumping result to json 33277 1726883064.40506: done dumping result, returning 33277 1726883064.40559: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-6628-6da4-000000000023] 33277 1726883064.40569: sending task result for task 0affc7ec-ae25-6628-6da4-000000000023 33277 1726883064.40695: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000023 33277 1726883064.40703: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33277 1726883064.40761: no more pending results, returning what we have 33277 1726883064.40765: results queue empty 33277 1726883064.40766: checking for any_errors_fatal 
33277 1726883064.40775: done checking for any_errors_fatal 33277 1726883064.40776: checking for max_fail_percentage 33277 1726883064.40778: done checking for max_fail_percentage 33277 1726883064.40778: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.40779: done checking to see if all hosts have failed 33277 1726883064.40780: getting the remaining hosts for this loop 33277 1726883064.40781: done getting the remaining hosts for this loop 33277 1726883064.40788: getting the next task for host managed_node2 33277 1726883064.40795: done getting next task for host managed_node2 33277 1726883064.40798: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33277 1726883064.40802: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.40818: getting variables 33277 1726883064.40820: in VariableManager get_vars() 33277 1726883064.40880: Calling all_inventory to load vars for managed_node2 33277 1726883064.40883: Calling groups_inventory to load vars for managed_node2 33277 1726883064.40889: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.40904: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.40907: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.40910: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.41410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.41826: done with get_vars() 33277 1726883064.41839: done getting variables 33277 1726883064.41909: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:24 -0400 (0:00:00.042) 0:00:07.607 ****** 33277 1726883064.41948: entering _queue_task() for managed_node2/service 33277 1726883064.42361: worker is 1 (out of 1 available) 33277 1726883064.42373: exiting _queue_task() for managed_node2/service 33277 1726883064.42383: done queuing things up, now waiting for results queue to drain 33277 1726883064.42388: waiting for pending results... 
33277 1726883064.42743: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33277 1726883064.43110: in run() - task 0affc7ec-ae25-6628-6da4-000000000024 33277 1726883064.43251: variable 'ansible_search_path' from source: unknown 33277 1726883064.43328: variable 'ansible_search_path' from source: unknown 33277 1726883064.43332: calling self._execute() 33277 1726883064.43473: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.43702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.43707: variable 'omit' from source: magic vars 33277 1726883064.44462: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.44489: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.44729: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.44818: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.44829: when evaluation is False, skipping this task 33277 1726883064.44838: _execute() done 33277 1726883064.44909: dumping result to json 33277 1726883064.44924: done dumping result, returning 33277 1726883064.44939: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-6628-6da4-000000000024] 33277 1726883064.44950: sending task result for task 0affc7ec-ae25-6628-6da4-000000000024 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.45278: no more pending results, returning what we have 33277 1726883064.45282: results queue empty 33277 1726883064.45284: checking for any_errors_fatal 33277 1726883064.45294: done checking for any_errors_fatal 33277 1726883064.45295: checking for max_fail_percentage 33277 1726883064.45297: 
done checking for max_fail_percentage 33277 1726883064.45297: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.45298: done checking to see if all hosts have failed 33277 1726883064.45299: getting the remaining hosts for this loop 33277 1726883064.45301: done getting the remaining hosts for this loop 33277 1726883064.45306: getting the next task for host managed_node2 33277 1726883064.45313: done getting next task for host managed_node2 33277 1726883064.45317: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 33277 1726883064.45320: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.45339: getting variables 33277 1726883064.45341: in VariableManager get_vars() 33277 1726883064.45405: Calling all_inventory to load vars for managed_node2 33277 1726883064.45409: Calling groups_inventory to load vars for managed_node2 33277 1726883064.45412: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.45496: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.45500: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.45506: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000024 33277 1726883064.45509: WORKER PROCESS EXITING 33277 1726883064.45513: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.46492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.47173: done with get_vars() 33277 1726883064.47190: done getting variables 33277 1726883064.47665: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:24 -0400 (0:00:00.057) 0:00:07.665 ****** 33277 1726883064.47705: entering _queue_task() for managed_node2/service 33277 1726883064.49065: worker is 1 (out of 1 available) 33277 1726883064.49078: exiting _queue_task() for managed_node2/service 33277 1726883064.49091: done queuing things up, now waiting for results queue to drain 33277 1726883064.49093: waiting for pending results... 
33277 1726883064.49259: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 33277 1726883064.49699: in run() - task 0affc7ec-ae25-6628-6da4-000000000025 33277 1726883064.49727: variable 'ansible_search_path' from source: unknown 33277 1726883064.49766: variable 'ansible_search_path' from source: unknown 33277 1726883064.49928: calling self._execute() 33277 1726883064.49971: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.49983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.49999: variable 'omit' from source: magic vars 33277 1726883064.50966: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.50970: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.51231: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.51235: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.51237: when evaluation is False, skipping this task 33277 1726883064.51240: _execute() done 33277 1726883064.51243: dumping result to json 33277 1726883064.51245: done dumping result, returning 33277 1726883064.51318: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-6628-6da4-000000000025] 33277 1726883064.51410: sending task result for task 0affc7ec-ae25-6628-6da4-000000000025 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33277 1726883064.51575: no more pending results, returning what we have 33277 1726883064.51579: results queue empty 33277 1726883064.51581: checking for any_errors_fatal 33277 1726883064.51590: done checking for any_errors_fatal 33277 1726883064.51591: checking for max_fail_percentage 33277 1726883064.51593: done checking for 
max_fail_percentage 33277 1726883064.51594: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.51595: done checking to see if all hosts have failed 33277 1726883064.51595: getting the remaining hosts for this loop 33277 1726883064.51597: done getting the remaining hosts for this loop 33277 1726883064.51602: getting the next task for host managed_node2 33277 1726883064.51609: done getting next task for host managed_node2 33277 1726883064.51613: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33277 1726883064.51616: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.51636: getting variables 33277 1726883064.51638: in VariableManager get_vars() 33277 1726883064.51699: Calling all_inventory to load vars for managed_node2 33277 1726883064.51703: Calling groups_inventory to load vars for managed_node2 33277 1726883064.51705: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.51719: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.52127: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.52134: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.52630: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000025 33277 1726883064.52634: WORKER PROCESS EXITING 33277 1726883064.52659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.53190: done with get_vars() 33277 1726883064.53204: done getting variables 33277 1726883064.53278: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:24 -0400 (0:00:00.056) 0:00:07.721 ****** 33277 1726883064.53317: entering _queue_task() for managed_node2/copy 33277 1726883064.54289: worker is 1 (out of 1 available) 33277 1726883064.54301: exiting _queue_task() for managed_node2/copy 33277 1726883064.54312: done queuing things up, now waiting for results queue to drain 33277 1726883064.54314: waiting for pending results... 
33277 1726883064.54548: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33277 1726883064.54736: in run() - task 0affc7ec-ae25-6628-6da4-000000000026 33277 1726883064.54757: variable 'ansible_search_path' from source: unknown 33277 1726883064.54765: variable 'ansible_search_path' from source: unknown 33277 1726883064.54815: calling self._execute() 33277 1726883064.54914: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.54930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.54944: variable 'omit' from source: magic vars 33277 1726883064.55356: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.55373: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.55511: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.55524: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.55531: when evaluation is False, skipping this task 33277 1726883064.55538: _execute() done 33277 1726883064.55543: dumping result to json 33277 1726883064.55550: done dumping result, returning 33277 1726883064.55567: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-6628-6da4-000000000026] 33277 1726883064.55577: sending task result for task 0affc7ec-ae25-6628-6da4-000000000026 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.55742: no more pending results, returning what we have 33277 1726883064.55746: results queue empty 33277 1726883064.55748: checking for any_errors_fatal 33277 1726883064.55756: done checking for any_errors_fatal 33277 1726883064.55756: checking for 
max_fail_percentage 33277 1726883064.55758: done checking for max_fail_percentage 33277 1726883064.55759: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.55760: done checking to see if all hosts have failed 33277 1726883064.55761: getting the remaining hosts for this loop 33277 1726883064.55762: done getting the remaining hosts for this loop 33277 1726883064.55767: getting the next task for host managed_node2 33277 1726883064.55773: done getting next task for host managed_node2 33277 1726883064.55777: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33277 1726883064.55780: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.55801: getting variables 33277 1726883064.55803: in VariableManager get_vars() 33277 1726883064.55862: Calling all_inventory to load vars for managed_node2 33277 1726883064.55866: Calling groups_inventory to load vars for managed_node2 33277 1726883064.55868: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.56059: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000026 33277 1726883064.56067: WORKER PROCESS EXITING 33277 1726883064.56278: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.56281: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.56376: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.57121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.57578: done with get_vars() 33277 1726883064.57592: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:24 -0400 (0:00:00.043) 0:00:07.765 ****** 33277 1726883064.57706: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 33277 1726883064.57712: Creating lock for fedora.linux_system_roles.network_connections 33277 1726883064.58690: worker is 1 (out of 1 available) 33277 1726883064.58705: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 33277 1726883064.58718: done queuing things up, now waiting for results queue to drain 33277 1726883064.58719: waiting for pending results... 
33277 1726883064.59371: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33277 1726883064.59911: in run() - task 0affc7ec-ae25-6628-6da4-000000000027 33277 1726883064.59916: variable 'ansible_search_path' from source: unknown 33277 1726883064.59919: variable 'ansible_search_path' from source: unknown 33277 1726883064.60096: calling self._execute() 33277 1726883064.60242: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.60252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.60302: variable 'omit' from source: magic vars 33277 1726883064.61368: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.61372: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.61929: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.61932: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.61934: when evaluation is False, skipping this task 33277 1726883064.61937: _execute() done 33277 1726883064.61939: dumping result to json 33277 1726883064.61941: done dumping result, returning 33277 1726883064.61944: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-6628-6da4-000000000027] 33277 1726883064.61946: sending task result for task 0affc7ec-ae25-6628-6da4-000000000027 33277 1726883064.62035: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000027 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.62093: no more pending results, returning what we have 33277 1726883064.62098: results queue empty 33277 1726883064.62099: checking for any_errors_fatal 33277 1726883064.62105: done 
checking for any_errors_fatal 33277 1726883064.62106: checking for max_fail_percentage 33277 1726883064.62108: done checking for max_fail_percentage 33277 1726883064.62108: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.62109: done checking to see if all hosts have failed 33277 1726883064.62110: getting the remaining hosts for this loop 33277 1726883064.62111: done getting the remaining hosts for this loop 33277 1726883064.62117: getting the next task for host managed_node2 33277 1726883064.62125: done getting next task for host managed_node2 33277 1726883064.62129: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33277 1726883064.62132: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.62148: getting variables 33277 1726883064.62150: in VariableManager get_vars() 33277 1726883064.62208: Calling all_inventory to load vars for managed_node2 33277 1726883064.62211: Calling groups_inventory to load vars for managed_node2 33277 1726883064.62214: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.62428: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.62432: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.62437: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.62888: WORKER PROCESS EXITING 33277 1726883064.62908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.63599: done with get_vars() 33277 1726883064.63610: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:24 -0400 (0:00:00.061) 0:00:07.827 ****** 33277 1726883064.63869: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 33277 1726883064.63871: Creating lock for fedora.linux_system_roles.network_state 33277 1726883064.64366: worker is 1 (out of 1 available) 33277 1726883064.64380: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 33277 1726883064.64392: done queuing things up, now waiting for results queue to drain 33277 1726883064.64397: waiting for pending results... 
33277 1726883064.64847: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 33277 1726883064.65169: in run() - task 0affc7ec-ae25-6628-6da4-000000000028 33277 1726883064.65452: variable 'ansible_search_path' from source: unknown 33277 1726883064.65466: variable 'ansible_search_path' from source: unknown 33277 1726883064.65562: calling self._execute() 33277 1726883064.65840: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.65874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.65892: variable 'omit' from source: magic vars 33277 1726883064.66690: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.66708: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.66965: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.66980: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.67009: when evaluation is False, skipping this task 33277 1726883064.67017: _execute() done 33277 1726883064.67025: dumping result to json 33277 1726883064.67032: done dumping result, returning 33277 1726883064.67043: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-6628-6da4-000000000028] 33277 1726883064.67106: sending task result for task 0affc7ec-ae25-6628-6da4-000000000028 33277 1726883064.67194: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000028 33277 1726883064.67197: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.67261: no more pending results, returning what we have 33277 1726883064.67271: results queue empty 33277 1726883064.67272: checking for any_errors_fatal 33277 
1726883064.67282: done checking for any_errors_fatal 33277 1726883064.67283: checking for max_fail_percentage 33277 1726883064.67284: done checking for max_fail_percentage 33277 1726883064.67287: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.67288: done checking to see if all hosts have failed 33277 1726883064.67289: getting the remaining hosts for this loop 33277 1726883064.67290: done getting the remaining hosts for this loop 33277 1726883064.67295: getting the next task for host managed_node2 33277 1726883064.67301: done getting next task for host managed_node2 33277 1726883064.67306: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33277 1726883064.67309: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.67327: getting variables 33277 1726883064.67329: in VariableManager get_vars() 33277 1726883064.67382: Calling all_inventory to load vars for managed_node2 33277 1726883064.67388: Calling groups_inventory to load vars for managed_node2 33277 1726883064.67393: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.67406: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.67409: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.67412: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.67931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.68206: done with get_vars() 33277 1726883064.68216: done getting variables 33277 1726883064.68275: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:24 -0400 (0:00:00.044) 0:00:07.871 ****** 33277 1726883064.68311: entering _queue_task() for managed_node2/debug 33277 1726883064.68694: worker is 1 (out of 1 available) 33277 1726883064.68708: exiting _queue_task() for managed_node2/debug 33277 1726883064.68718: done queuing things up, now waiting for results queue to drain 33277 1726883064.68719: waiting for pending results... 
33277 1726883064.69241: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33277 1726883064.69301: in run() - task 0affc7ec-ae25-6628-6da4-000000000029 33277 1726883064.69323: variable 'ansible_search_path' from source: unknown 33277 1726883064.69338: variable 'ansible_search_path' from source: unknown 33277 1726883064.69447: calling self._execute() 33277 1726883064.69457: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.69469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.69482: variable 'omit' from source: magic vars 33277 1726883064.69877: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.69895: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.70025: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.70100: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.70104: when evaluation is False, skipping this task 33277 1726883064.70107: _execute() done 33277 1726883064.70109: dumping result to json 33277 1726883064.70111: done dumping result, returning 33277 1726883064.70114: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-6628-6da4-000000000029] 33277 1726883064.70116: sending task result for task 0affc7ec-ae25-6628-6da4-000000000029 33277 1726883064.70187: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000029 33277 1726883064.70189: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883064.70251: no more pending results, returning what we have 33277 1726883064.70254: results queue empty 33277 1726883064.70255: checking for any_errors_fatal 33277 1726883064.70262: done 
checking for any_errors_fatal 33277 1726883064.70262: checking for max_fail_percentage 33277 1726883064.70264: done checking for max_fail_percentage 33277 1726883064.70265: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.70265: done checking to see if all hosts have failed 33277 1726883064.70266: getting the remaining hosts for this loop 33277 1726883064.70268: done getting the remaining hosts for this loop 33277 1726883064.70272: getting the next task for host managed_node2 33277 1726883064.70546: done getting next task for host managed_node2 33277 1726883064.70551: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33277 1726883064.70554: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.70567: getting variables 33277 1726883064.70569: in VariableManager get_vars() 33277 1726883064.70612: Calling all_inventory to load vars for managed_node2 33277 1726883064.70615: Calling groups_inventory to load vars for managed_node2 33277 1726883064.70618: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.70648: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.70652: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.70655: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.70973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.71264: done with get_vars() 33277 1726883064.71273: done getting variables 33277 1726883064.71333: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:24 -0400 (0:00:00.030) 0:00:07.902 ****** 33277 1726883064.71386: entering _queue_task() for managed_node2/debug 33277 1726883064.71700: worker is 1 (out of 1 available) 33277 1726883064.71712: exiting _queue_task() for managed_node2/debug 33277 1726883064.71910: done queuing things up, now waiting for results queue to drain 33277 1726883064.71912: waiting for pending results... 
33277 1726883064.72277: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33277 1726883064.72529: in run() - task 0affc7ec-ae25-6628-6da4-00000000002a 33277 1726883064.72532: variable 'ansible_search_path' from source: unknown 33277 1726883064.72534: variable 'ansible_search_path' from source: unknown 33277 1726883064.72537: calling self._execute() 33277 1726883064.72540: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.72580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.72599: variable 'omit' from source: magic vars 33277 1726883064.73455: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.73472: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.73677: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.73692: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.73699: when evaluation is False, skipping this task 33277 1726883064.73705: _execute() done 33277 1726883064.73712: dumping result to json 33277 1726883064.73763: done dumping result, returning 33277 1726883064.73819: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-6628-6da4-00000000002a] 33277 1726883064.73879: sending task result for task 0affc7ec-ae25-6628-6da4-00000000002a 33277 1726883064.73952: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000002a 33277 1726883064.73955: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883064.74035: no more pending results, returning what we have 33277 1726883064.74040: results queue empty 33277 1726883064.74042: checking for any_errors_fatal 33277 1726883064.74052: done 
checking for any_errors_fatal 33277 1726883064.74053: checking for max_fail_percentage 33277 1726883064.74054: done checking for max_fail_percentage 33277 1726883064.74055: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.74056: done checking to see if all hosts have failed 33277 1726883064.74057: getting the remaining hosts for this loop 33277 1726883064.74058: done getting the remaining hosts for this loop 33277 1726883064.74062: getting the next task for host managed_node2 33277 1726883064.74069: done getting next task for host managed_node2 33277 1726883064.74073: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33277 1726883064.74076: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.74094: getting variables 33277 1726883064.74096: in VariableManager get_vars() 33277 1726883064.74149: Calling all_inventory to load vars for managed_node2 33277 1726883064.74152: Calling groups_inventory to load vars for managed_node2 33277 1726883064.74154: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.74173: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.74176: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.74179: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.74964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.75866: done with get_vars() 33277 1726883064.75882: done getting variables 33277 1726883064.75991: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:24 -0400 (0:00:00.046) 0:00:07.948 ****** 33277 1726883064.76049: entering _queue_task() for managed_node2/debug 33277 1726883064.76579: worker is 1 (out of 1 available) 33277 1726883064.76595: exiting _queue_task() for managed_node2/debug 33277 1726883064.76608: done queuing things up, now waiting for results queue to drain 33277 1726883064.76609: waiting for pending results... 
33277 1726883064.77346: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33277 1726883064.77630: in run() - task 0affc7ec-ae25-6628-6da4-00000000002b 33277 1726883064.78332: variable 'ansible_search_path' from source: unknown 33277 1726883064.78336: variable 'ansible_search_path' from source: unknown 33277 1726883064.78339: calling self._execute() 33277 1726883064.78378: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.78393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.78409: variable 'omit' from source: magic vars 33277 1726883064.79375: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.79404: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.79671: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.79752: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.79769: when evaluation is False, skipping this task 33277 1726883064.79776: _execute() done 33277 1726883064.79783: dumping result to json 33277 1726883064.79792: done dumping result, returning 33277 1726883064.79805: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-6628-6da4-00000000002b] 33277 1726883064.79835: sending task result for task 0affc7ec-ae25-6628-6da4-00000000002b 33277 1726883064.80132: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000002b 33277 1726883064.80136: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883064.80212: no more pending results, returning what we have 33277 1726883064.80216: results queue empty 33277 1726883064.80218: checking for any_errors_fatal 33277 1726883064.80226: done checking for 
any_errors_fatal 33277 1726883064.80227: checking for max_fail_percentage 33277 1726883064.80229: done checking for max_fail_percentage 33277 1726883064.80230: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.80231: done checking to see if all hosts have failed 33277 1726883064.80232: getting the remaining hosts for this loop 33277 1726883064.80233: done getting the remaining hosts for this loop 33277 1726883064.80237: getting the next task for host managed_node2 33277 1726883064.80244: done getting next task for host managed_node2 33277 1726883064.80248: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33277 1726883064.80252: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.80267: getting variables 33277 1726883064.80270: in VariableManager get_vars() 33277 1726883064.80442: Calling all_inventory to load vars for managed_node2 33277 1726883064.80445: Calling groups_inventory to load vars for managed_node2 33277 1726883064.80448: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.80461: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.80463: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.80467: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.81756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.82315: done with get_vars() 33277 1726883064.82327: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:24 -0400 (0:00:00.065) 0:00:08.014 ****** 33277 1726883064.82592: entering _queue_task() for managed_node2/ping 33277 1726883064.82594: Creating lock for ping 33277 1726883064.83352: worker is 1 (out of 1 available) 33277 1726883064.83365: exiting _queue_task() for managed_node2/ping 33277 1726883064.83376: done queuing things up, now waiting for results queue to drain 33277 1726883064.83377: waiting for pending results... 
33277 1726883064.83978: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 33277 1726883064.84056: in run() - task 0affc7ec-ae25-6628-6da4-00000000002c 33277 1726883064.84284: variable 'ansible_search_path' from source: unknown 33277 1726883064.84288: variable 'ansible_search_path' from source: unknown 33277 1726883064.84291: calling self._execute() 33277 1726883064.84448: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.84462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.84476: variable 'omit' from source: magic vars 33277 1726883064.85353: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.85492: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.85736: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.85770: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.85777: when evaluation is False, skipping this task 33277 1726883064.85810: _execute() done 33277 1726883064.85834: dumping result to json 33277 1726883064.85843: done dumping result, returning 33277 1726883064.86021: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-6628-6da4-00000000002c] 33277 1726883064.86026: sending task result for task 0affc7ec-ae25-6628-6da4-00000000002c 33277 1726883064.86107: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000002c 33277 1726883064.86112: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.86174: no more pending results, returning what we have 33277 1726883064.86178: results queue empty 33277 1726883064.86179: checking for any_errors_fatal 33277 
1726883064.86190: done checking for any_errors_fatal 33277 1726883064.86191: checking for max_fail_percentage 33277 1726883064.86193: done checking for max_fail_percentage 33277 1726883064.86193: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.86194: done checking to see if all hosts have failed 33277 1726883064.86195: getting the remaining hosts for this loop 33277 1726883064.86197: done getting the remaining hosts for this loop 33277 1726883064.86202: getting the next task for host managed_node2 33277 1726883064.86211: done getting next task for host managed_node2 33277 1726883064.86214: ^ task is: TASK: meta (role_complete) 33277 1726883064.86217: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.86243: getting variables 33277 1726883064.86245: in VariableManager get_vars() 33277 1726883064.86504: Calling all_inventory to load vars for managed_node2 33277 1726883064.86507: Calling groups_inventory to load vars for managed_node2 33277 1726883064.86510: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.86519: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.86523: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.86616: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.87104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.87906: done with get_vars() 33277 1726883064.87923: done getting variables 33277 1726883064.88013: done queuing things up, now waiting for results queue to drain 33277 1726883064.88015: results queue empty 33277 1726883064.88015: checking for any_errors_fatal 33277 1726883064.88020: done checking for any_errors_fatal 33277 1726883064.88020: checking for max_fail_percentage 33277 1726883064.88022: done checking for max_fail_percentage 33277 1726883064.88022: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.88023: done checking to see if all hosts have failed 33277 1726883064.88024: getting the remaining hosts for this loop 33277 1726883064.88025: done getting the remaining hosts for this loop 33277 1726883064.88030: getting the next task for host managed_node2 33277 1726883064.88035: done getting next task for host managed_node2 33277 1726883064.88038: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33277 1726883064.88041: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883064.88051: getting variables 33277 1726883064.88053: in VariableManager get_vars() 33277 1726883064.88074: Calling all_inventory to load vars for managed_node2 33277 1726883064.88076: Calling groups_inventory to load vars for managed_node2 33277 1726883064.88078: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.88083: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.88087: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.88091: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.88518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.88794: done with get_vars() 33277 1726883064.88803: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:24 -0400 (0:00:00.062) 0:00:08.077 ****** 33277 1726883064.88895: entering _queue_task() for managed_node2/include_tasks 33277 1726883064.89344: worker is 1 (out of 1 available) 33277 1726883064.89355: exiting _queue_task() for managed_node2/include_tasks 33277 1726883064.89365: done queuing things up, now waiting for results queue to drain 33277 1726883064.89366: waiting for pending results... 
33277 1726883064.89852: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33277 1726883064.89857: in run() - task 0affc7ec-ae25-6628-6da4-000000000063 33277 1726883064.89860: variable 'ansible_search_path' from source: unknown 33277 1726883064.89863: variable 'ansible_search_path' from source: unknown 33277 1726883064.89865: calling self._execute() 33277 1726883064.89891: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.89900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.89908: variable 'omit' from source: magic vars 33277 1726883064.90351: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.90368: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.90710: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.90713: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.90716: when evaluation is False, skipping this task 33277 1726883064.90718: _execute() done 33277 1726883064.90721: dumping result to json 33277 1726883064.90725: done dumping result, returning 33277 1726883064.90727: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-6628-6da4-000000000063] 33277 1726883064.90730: sending task result for task 0affc7ec-ae25-6628-6da4-000000000063 33277 1726883064.90800: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000063 33277 1726883064.90803: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883064.90849: no more pending results, returning what we have 33277 1726883064.90853: results queue empty 33277 1726883064.90854: checking for 
any_errors_fatal 33277 1726883064.90855: done checking for any_errors_fatal 33277 1726883064.90856: checking for max_fail_percentage 33277 1726883064.90858: done checking for max_fail_percentage 33277 1726883064.90858: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.90859: done checking to see if all hosts have failed 33277 1726883064.90860: getting the remaining hosts for this loop 33277 1726883064.90861: done getting the remaining hosts for this loop 33277 1726883064.90864: getting the next task for host managed_node2 33277 1726883064.90869: done getting next task for host managed_node2 33277 1726883064.90873: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33277 1726883064.90875: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.90895: getting variables 33277 1726883064.90896: in VariableManager get_vars() 33277 1726883064.90941: Calling all_inventory to load vars for managed_node2 33277 1726883064.90944: Calling groups_inventory to load vars for managed_node2 33277 1726883064.90946: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.90971: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.90975: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.90979: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.91194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.91455: done with get_vars() 33277 1726883064.91465: done getting variables 33277 1726883064.91536: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:24 -0400 (0:00:00.026) 0:00:08.104 ****** 33277 1726883064.91569: entering _queue_task() for managed_node2/debug 33277 1726883064.91949: worker is 1 (out of 1 available) 33277 1726883064.91960: exiting _queue_task() for managed_node2/debug 33277 1726883064.91970: done queuing things up, now waiting for results queue to drain 33277 1726883064.91972: waiting for pending results... 
33277 1726883064.92179: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 33277 1726883064.92421: in run() - task 0affc7ec-ae25-6628-6da4-000000000064 33277 1726883064.92426: variable 'ansible_search_path' from source: unknown 33277 1726883064.92429: variable 'ansible_search_path' from source: unknown 33277 1726883064.92432: calling self._execute() 33277 1726883064.92549: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.92554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.92565: variable 'omit' from source: magic vars 33277 1726883064.93505: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.93518: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.93796: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.93929: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.93933: when evaluation is False, skipping this task 33277 1726883064.93935: _execute() done 33277 1726883064.93938: dumping result to json 33277 1726883064.93941: done dumping result, returning 33277 1726883064.93944: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-6628-6da4-000000000064] 33277 1726883064.93947: sending task result for task 0affc7ec-ae25-6628-6da4-000000000064 33277 1726883064.94191: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000064 33277 1726883064.94195: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883064.94235: no more pending results, returning what we have 33277 1726883064.94238: results queue empty 33277 1726883064.94239: checking for any_errors_fatal 33277 1726883064.94244: done checking for any_errors_fatal 33277 1726883064.94245: 
checking for max_fail_percentage 33277 1726883064.94247: done checking for max_fail_percentage 33277 1726883064.94248: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.94249: done checking to see if all hosts have failed 33277 1726883064.94250: getting the remaining hosts for this loop 33277 1726883064.94251: done getting the remaining hosts for this loop 33277 1726883064.94254: getting the next task for host managed_node2 33277 1726883064.94259: done getting next task for host managed_node2 33277 1726883064.94263: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33277 1726883064.94266: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.94283: getting variables 33277 1726883064.94284: in VariableManager get_vars() 33277 1726883064.94329: Calling all_inventory to load vars for managed_node2 33277 1726883064.94332: Calling groups_inventory to load vars for managed_node2 33277 1726883064.94334: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.94343: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.94345: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.94348: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.94673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.94945: done with get_vars() 33277 1726883064.94956: done getting variables 33277 1726883064.95026: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:24 -0400 (0:00:00.034) 0:00:08.138 ****** 33277 1726883064.95061: entering _queue_task() for managed_node2/fail 33277 1726883064.95421: worker is 1 (out of 1 available) 33277 1726883064.95436: exiting _queue_task() for managed_node2/fail 33277 1726883064.95447: done queuing things up, now waiting for results queue to drain 33277 1726883064.95449: waiting for pending results... 
33277 1726883064.95843: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33277 1726883064.95853: in run() - task 0affc7ec-ae25-6628-6da4-000000000065 33277 1726883064.95865: variable 'ansible_search_path' from source: unknown 33277 1726883064.95869: variable 'ansible_search_path' from source: unknown 33277 1726883064.95916: calling self._execute() 33277 1726883064.96217: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883064.96220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883064.96225: variable 'omit' from source: magic vars 33277 1726883064.96469: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.96482: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883064.96623: variable 'ansible_distribution_major_version' from source: facts 33277 1726883064.96629: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883064.96632: when evaluation is False, skipping this task 33277 1726883064.96635: _execute() done 33277 1726883064.96638: dumping result to json 33277 1726883064.96640: done dumping result, returning 33277 1726883064.96649: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-6628-6da4-000000000065] 33277 1726883064.96652: sending task result for task 0affc7ec-ae25-6628-6da4-000000000065 33277 1726883064.96759: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000065 33277 1726883064.96763: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
33277 1726883064.96817: no more pending results, returning what we have 33277 1726883064.96822: results queue empty 33277 1726883064.96824: checking for any_errors_fatal 33277 1726883064.96831: done checking for any_errors_fatal 33277 1726883064.96832: checking for max_fail_percentage 33277 1726883064.96834: done checking for max_fail_percentage 33277 1726883064.96834: checking to see if all hosts have failed and the running result is not ok 33277 1726883064.96835: done checking to see if all hosts have failed 33277 1726883064.96836: getting the remaining hosts for this loop 33277 1726883064.96838: done getting the remaining hosts for this loop 33277 1726883064.96843: getting the next task for host managed_node2 33277 1726883064.96849: done getting next task for host managed_node2 33277 1726883064.96853: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33277 1726883064.96856: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883064.96877: getting variables 33277 1726883064.96878: in VariableManager get_vars() 33277 1726883064.96941: Calling all_inventory to load vars for managed_node2 33277 1726883064.96944: Calling groups_inventory to load vars for managed_node2 33277 1726883064.96947: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883064.96961: Calling all_plugins_play to load vars for managed_node2 33277 1726883064.96963: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883064.96967: Calling groups_plugins_play to load vars for managed_node2 33277 1726883064.97331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883064.97607: done with get_vars() 33277 1726883064.97618: done getting variables 33277 1726883064.97692: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:24 -0400 (0:00:00.026) 0:00:08.165 ****** 33277 1726883064.97727: entering _queue_task() for managed_node2/fail 33277 1726883064.98294: worker is 1 (out of 1 available) 33277 1726883064.98306: exiting _queue_task() for managed_node2/fail 33277 1726883064.98316: done queuing things up, now waiting for results queue to drain 33277 1726883064.98318: waiting for pending results... 
33277 1726883064.98908: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
33277 1726883064.99296: in run() - task 0affc7ec-ae25-6628-6da4-000000000066
33277 1726883064.99312: variable 'ansible_search_path' from source: unknown
33277 1726883064.99316: variable 'ansible_search_path' from source: unknown
33277 1726883064.99440: calling self._execute()
33277 1726883064.99565: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883064.99571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883064.99581: variable 'omit' from source: magic vars
33277 1726883065.00449: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.00461: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.00747: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.00751: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.00754: when evaluation is False, skipping this task
33277 1726883065.00756: _execute() done
33277 1726883065.00759: dumping result to json
33277 1726883065.00762: done dumping result, returning
33277 1726883065.00768: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-6628-6da4-000000000066]
33277 1726883065.00774: sending task result for task 0affc7ec-ae25-6628-6da4-000000000066
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.01046: no more pending results, returning what we have
33277 1726883065.01051: results queue empty
33277 1726883065.01052: checking for any_errors_fatal
33277 1726883065.01058: done checking for any_errors_fatal
33277 1726883065.01059: checking for max_fail_percentage
33277 1726883065.01060: done checking for max_fail_percentage
33277 1726883065.01061: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.01062: done checking to see if all hosts have failed
33277 1726883065.01062: getting the remaining hosts for this loop
33277 1726883065.01064: done getting the remaining hosts for this loop
33277 1726883065.01069: getting the next task for host managed_node2
33277 1726883065.01075: done getting next task for host managed_node2
33277 1726883065.01080: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
33277 1726883065.01082: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.01113: getting variables
33277 1726883065.01115: in VariableManager get_vars()
33277 1726883065.01171: Calling all_inventory to load vars for managed_node2
33277 1726883065.01174: Calling groups_inventory to load vars for managed_node2
33277 1726883065.01177: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.01195: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.01199: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.01204: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.01779: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000066
33277 1726883065.02108: WORKER PROCESS EXITING
33277 1726883065.02136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.02613: done with get_vars()
33277 1726883065.02625: done getting variables
33277 1726883065.02802: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:44:25 -0400 (0:00:00.051) 0:00:08.217 ******
33277 1726883065.02868: entering _queue_task() for managed_node2/fail
33277 1726883065.03496: worker is 1 (out of 1 available)
33277 1726883065.03625: exiting _queue_task() for managed_node2/fail
33277 1726883065.03638: done queuing things up, now waiting for results queue to drain
33277 1726883065.03641: waiting for pending results...
33277 1726883065.04160: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
33277 1726883065.04271: in run() - task 0affc7ec-ae25-6628-6da4-000000000067
33277 1726883065.04290: variable 'ansible_search_path' from source: unknown
33277 1726883065.04474: variable 'ansible_search_path' from source: unknown
33277 1726883065.04478: calling self._execute()
33277 1726883065.04590: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.04603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.04616: variable 'omit' from source: magic vars
33277 1726883065.05067: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.05086: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.05212: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.05226: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.05240: when evaluation is False, skipping this task
33277 1726883065.05249: _execute() done
33277 1726883065.05256: dumping result to json
33277 1726883065.05264: done dumping result, returning
33277 1726883065.05277: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-6628-6da4-000000000067]
33277 1726883065.05287: sending task result for task 0affc7ec-ae25-6628-6da4-000000000067
33277 1726883065.05429: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000067
33277 1726883065.05432: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.05571: no more pending results, returning what we have
33277 1726883065.05576: results queue empty
33277 1726883065.05577: checking for any_errors_fatal
33277 1726883065.05582: done checking for any_errors_fatal
33277 1726883065.05583: checking for max_fail_percentage
33277 1726883065.05585: done checking for max_fail_percentage
33277 1726883065.05588: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.05589: done checking to see if all hosts have failed
33277 1726883065.05589: getting the remaining hosts for this loop
33277 1726883065.05591: done getting the remaining hosts for this loop
33277 1726883065.05594: getting the next task for host managed_node2
33277 1726883065.05600: done getting next task for host managed_node2
33277 1726883065.05603: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
33277 1726883065.05606: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.05620: getting variables
33277 1726883065.05624: in VariableManager get_vars()
33277 1726883065.05670: Calling all_inventory to load vars for managed_node2
33277 1726883065.05673: Calling groups_inventory to load vars for managed_node2
33277 1726883065.05675: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.05684: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.05690: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.05693: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.06015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.06334: done with get_vars()
33277 1726883065.06345: done getting variables
33277 1726883065.06416: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:44:25 -0400 (0:00:00.035) 0:00:08.252 ******
33277 1726883065.06464: entering _queue_task() for managed_node2/dnf
33277 1726883065.06860: worker is 1 (out of 1 available)
33277 1726883065.06872: exiting _queue_task() for managed_node2/dnf
33277 1726883065.06882: done queuing things up, now waiting for results queue to drain
33277 1726883065.06883: waiting for pending results...
33277 1726883065.07154: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
33277 1726883065.07313: in run() - task 0affc7ec-ae25-6628-6da4-000000000068
33277 1726883065.07318: variable 'ansible_search_path' from source: unknown
33277 1726883065.07321: variable 'ansible_search_path' from source: unknown
33277 1726883065.07422: calling self._execute()
33277 1726883065.07450: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.07454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.07464: variable 'omit' from source: magic vars
33277 1726883065.07979: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.08016: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.08199: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.08203: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.08205: when evaluation is False, skipping this task
33277 1726883065.08208: _execute() done
33277 1726883065.08210: dumping result to json
33277 1726883065.08212: done dumping result, returning
33277 1726883065.08294: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-000000000068]
33277 1726883065.08298: sending task result for task 0affc7ec-ae25-6628-6da4-000000000068
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.08425: no more pending results, returning what we have
33277 1726883065.08429: results queue empty
33277 1726883065.08430: checking for any_errors_fatal
33277 1726883065.08445: done checking for any_errors_fatal
33277 1726883065.08446: checking for max_fail_percentage
33277 1726883065.08448: done checking for max_fail_percentage
33277 1726883065.08448: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.08450: done checking to see if all hosts have failed
33277 1726883065.08451: getting the remaining hosts for this loop
33277 1726883065.08452: done getting the remaining hosts for this loop
33277 1726883065.08456: getting the next task for host managed_node2
33277 1726883065.08462: done getting next task for host managed_node2
33277 1726883065.08467: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
33277 1726883065.08470: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.08492: getting variables
33277 1726883065.08495: in VariableManager get_vars()
33277 1726883065.08705: Calling all_inventory to load vars for managed_node2
33277 1726883065.08708: Calling groups_inventory to load vars for managed_node2
33277 1726883065.08711: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.08720: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.08725: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.08729: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.08983: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000068
33277 1726883065.08988: WORKER PROCESS EXITING
33277 1726883065.09026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.09475: done with get_vars()
33277 1726883065.09488: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
33277 1726883065.09589: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:44:25 -0400 (0:00:00.031) 0:00:08.284 ******
33277 1726883065.09640: entering _queue_task() for managed_node2/yum
33277 1726883065.09964: worker is 1 (out of 1 available)
33277 1726883065.10093: exiting _queue_task() for managed_node2/yum
33277 1726883065.10105: done queuing things up, now waiting for results queue to drain
33277 1726883065.10106: waiting for pending results...
33277 1726883065.10308: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
33277 1726883065.10447: in run() - task 0affc7ec-ae25-6628-6da4-000000000069
33277 1726883065.10464: variable 'ansible_search_path' from source: unknown
33277 1726883065.10467: variable 'ansible_search_path' from source: unknown
33277 1726883065.10508: calling self._execute()
33277 1726883065.10603: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.10698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.10702: variable 'omit' from source: magic vars
33277 1726883065.11045: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.11057: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.11186: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.11193: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.11197: when evaluation is False, skipping this task
33277 1726883065.11199: _execute() done
33277 1726883065.11202: dumping result to json
33277 1726883065.11205: done dumping result, returning
33277 1726883065.11212: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-000000000069]
33277 1726883065.11217: sending task result for task 0affc7ec-ae25-6628-6da4-000000000069
33277 1726883065.11321: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000069
33277 1726883065.11326: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.11435: no more pending results, returning what we have
33277 1726883065.11439: results queue empty
33277 1726883065.11441: checking for any_errors_fatal
33277 1726883065.11447: done checking for any_errors_fatal
33277 1726883065.11447: checking for max_fail_percentage
33277 1726883065.11449: done checking for max_fail_percentage
33277 1726883065.11450: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.11450: done checking to see if all hosts have failed
33277 1726883065.11451: getting the remaining hosts for this loop
33277 1726883065.11452: done getting the remaining hosts for this loop
33277 1726883065.11456: getting the next task for host managed_node2
33277 1726883065.11462: done getting next task for host managed_node2
33277 1726883065.11466: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
33277 1726883065.11469: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.11490: getting variables
33277 1726883065.11491: in VariableManager get_vars()
33277 1726883065.11646: Calling all_inventory to load vars for managed_node2
33277 1726883065.11649: Calling groups_inventory to load vars for managed_node2
33277 1726883065.11652: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.11661: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.11663: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.11681: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.11914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.12194: done with get_vars()
33277 1726883065.12204: done getting variables
33277 1726883065.12269: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:44:25 -0400 (0:00:00.026) 0:00:08.311 ******
33277 1726883065.12302: entering _queue_task() for managed_node2/fail
33277 1726883065.12626: worker is 1 (out of 1 available)
33277 1726883065.12638: exiting _queue_task() for managed_node2/fail
33277 1726883065.12650: done queuing things up, now waiting for results queue to drain
33277 1726883065.12652: waiting for pending results...
33277 1726883065.13085: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
33277 1726883065.13098: in run() - task 0affc7ec-ae25-6628-6da4-00000000006a
33277 1726883065.13102: variable 'ansible_search_path' from source: unknown
33277 1726883065.13105: variable 'ansible_search_path' from source: unknown
33277 1726883065.13107: calling self._execute()
33277 1726883065.13208: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.13212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.13214: variable 'omit' from source: magic vars
33277 1726883065.13640: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.13645: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.13748: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.13752: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.13759: when evaluation is False, skipping this task
33277 1726883065.13762: _execute() done
33277 1726883065.13765: dumping result to json
33277 1726883065.13767: done dumping result, returning
33277 1726883065.13770: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-00000000006a]
33277 1726883065.13772: sending task result for task 0affc7ec-ae25-6628-6da4-00000000006a
33277 1726883065.14005: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000006a
33277 1726883065.14008: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.14054: no more pending results, returning what we have
33277 1726883065.14057: results queue empty
33277 1726883065.14058: checking for any_errors_fatal
33277 1726883065.14065: done checking for any_errors_fatal
33277 1726883065.14066: checking for max_fail_percentage
33277 1726883065.14068: done checking for max_fail_percentage
33277 1726883065.14069: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.14069: done checking to see if all hosts have failed
33277 1726883065.14070: getting the remaining hosts for this loop
33277 1726883065.14071: done getting the remaining hosts for this loop
33277 1726883065.14074: getting the next task for host managed_node2
33277 1726883065.14080: done getting next task for host managed_node2
33277 1726883065.14084: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
33277 1726883065.14089: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.14105: getting variables
33277 1726883065.14107: in VariableManager get_vars()
33277 1726883065.14155: Calling all_inventory to load vars for managed_node2
33277 1726883065.14158: Calling groups_inventory to load vars for managed_node2
33277 1726883065.14160: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.14169: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.14172: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.14175: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.14494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.15228: done with get_vars()
33277 1726883065.15238: done getting variables
33277 1726883065.15303: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:44:25 -0400 (0:00:00.031) 0:00:08.342 ******
33277 1726883065.15459: entering _queue_task() for managed_node2/package
33277 1726883065.15999: worker is 1 (out of 1 available)
33277 1726883065.16013: exiting _queue_task() for managed_node2/package
33277 1726883065.16027: done queuing things up, now waiting for results queue to drain
33277 1726883065.16029: waiting for pending results...
33277 1726883065.16503: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages
33277 1726883065.16707: in run() - task 0affc7ec-ae25-6628-6da4-00000000006b
33277 1726883065.16853: variable 'ansible_search_path' from source: unknown
33277 1726883065.16856: variable 'ansible_search_path' from source: unknown
33277 1726883065.16889: calling self._execute()
33277 1726883065.17231: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.17236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.17238: variable 'omit' from source: magic vars
33277 1726883065.18130: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.18148: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.18558: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.18561: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.18563: when evaluation is False, skipping this task
33277 1726883065.18566: _execute() done
33277 1726883065.18568: dumping result to json
33277 1726883065.18570: done dumping result, returning
33277 1726883065.18573: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-6628-6da4-00000000006b]
33277 1726883065.18575: sending task result for task 0affc7ec-ae25-6628-6da4-00000000006b
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.18824: no more pending results, returning what we have
33277 1726883065.18831: results queue empty
33277 1726883065.18833: checking for any_errors_fatal
33277 1726883065.18841: done checking for any_errors_fatal
33277 1726883065.18842: checking for max_fail_percentage
33277 1726883065.18844: done checking for max_fail_percentage
33277 1726883065.18845: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.18846: done checking to see if all hosts have failed
33277 1726883065.18846: getting the remaining hosts for this loop
33277 1726883065.18848: done getting the remaining hosts for this loop
33277 1726883065.18852: getting the next task for host managed_node2
33277 1726883065.18859: done getting next task for host managed_node2
33277 1726883065.18863: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
33277 1726883065.18866: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.18893: getting variables
33277 1726883065.18895: in VariableManager get_vars()
33277 1726883065.19061: Calling all_inventory to load vars for managed_node2
33277 1726883065.19064: Calling groups_inventory to load vars for managed_node2
33277 1726883065.19067: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.19074: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000006b
33277 1726883065.19088: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.19091: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.19096: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.19577: WORKER PROCESS EXITING
33277 1726883065.19640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.20243: done with get_vars()
33277 1726883065.20254: done getting variables
33277 1726883065.20317: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:44:25 -0400 (0:00:00.049) 0:00:08.392 ******
33277 1726883065.20438: entering _queue_task() for managed_node2/package
33277 1726883065.21048: worker is 1 (out of 1 available)
33277 1726883065.21061: exiting _queue_task() for managed_node2/package
33277 1726883065.21073: done queuing things up, now waiting for results queue to drain
33277 1726883065.21074: waiting for pending results...
33277 1726883065.21523: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
33277 1726883065.21841: in run() - task 0affc7ec-ae25-6628-6da4-00000000006c
33277 1726883065.21845: variable 'ansible_search_path' from source: unknown
33277 1726883065.21848: variable 'ansible_search_path' from source: unknown
33277 1726883065.21851: calling self._execute()
33277 1726883065.22015: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.22067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.22083: variable 'omit' from source: magic vars
33277 1726883065.23040: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.23044: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.23367: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.23371: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.23373: when evaluation is False, skipping this task
33277 1726883065.23376: _execute() done
33277 1726883065.23378: dumping result to json
33277 1726883065.23380: done dumping result, returning
33277 1726883065.23382: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-6628-6da4-00000000006c]
33277 1726883065.23385: sending task result for task 0affc7ec-ae25-6628-6da4-00000000006c
33277 1726883065.23465: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000006c
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.23526: no more pending results, returning what we have
33277 1726883065.23530: results queue empty
33277 1726883065.23532: checking for any_errors_fatal
33277 1726883065.23540: done checking for any_errors_fatal
33277 1726883065.23541: checking for max_fail_percentage
33277 1726883065.23542: done checking for max_fail_percentage
33277 1726883065.23543: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.23544: done checking to see if all hosts have failed
33277 1726883065.23545: getting the remaining hosts for this loop
33277 1726883065.23546: done getting the remaining hosts for this loop
33277 1726883065.23551: getting the next task for host managed_node2
33277 1726883065.23557: done getting next task for host managed_node2
33277 1726883065.23561: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
33277 1726883065.23564: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.23588: getting variables
33277 1726883065.23590: in VariableManager get_vars()
33277 1726883065.23759: Calling all_inventory to load vars for managed_node2
33277 1726883065.23762: Calling groups_inventory to load vars for managed_node2
33277 1726883065.23764: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.23779: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.23782: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.23788: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.24249: WORKER PROCESS EXITING
33277 1726883065.24266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.24779: done with get_vars()
33277 1726883065.24794: done getting variables
33277 1726883065.24917: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:44:25 -0400 (0:00:00.045) 0:00:08.437 ******
33277 1726883065.24949: entering _queue_task() for managed_node2/package
33277 1726883065.25737: worker is 1 (out of 1 available)
33277 1726883065.25831: exiting _queue_task() for managed_node2/package
33277 1726883065.25845: done queuing things up, now waiting for results queue to drain
33277 1726883065.25847: waiting for pending results...
33277 1726883065.26318: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
33277 1726883065.26571: in run() - task 0affc7ec-ae25-6628-6da4-00000000006d
33277 1726883065.26735: variable 'ansible_search_path' from source: unknown
33277 1726883065.26746: variable 'ansible_search_path' from source: unknown
33277 1726883065.26794: calling self._execute()
33277 1726883065.26966: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.26984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.27000: variable 'omit' from source: magic vars
33277 1726883065.27860: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.28027: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.28228: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.28231: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.28234: when evaluation is False, skipping this task
33277 1726883065.28236: _execute() done
33277 1726883065.28239: dumping result to json
33277 1726883065.28241: done dumping result, returning
33277 1726883065.28244: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-6628-6da4-00000000006d]
33277 1726883065.28246: sending task result for task 0affc7ec-ae25-6628-6da4-00000000006d
33277 1726883065.28584: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000006d
33277 1726883065.28587: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.28648: no more pending results, returning what we have
33277 1726883065.28651: results queue empty
33277 1726883065.28653: checking for any_errors_fatal
33277 1726883065.28661: done checking for any_errors_fatal
33277 1726883065.28661: checking for max_fail_percentage
33277 1726883065.28663: done checking for max_fail_percentage
33277 1726883065.28663: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.28664: done checking to see if all hosts have failed
33277 1726883065.28665: getting the remaining hosts for this loop
33277 1726883065.28666: done getting the remaining hosts for this loop
33277 1726883065.28670: getting the next task for host managed_node2
33277 1726883065.28676: done getting next task for host managed_node2
33277 1726883065.28680: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
33277 1726883065.28683: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.28710: getting variables
33277 1726883065.28712: in VariableManager get_vars()
33277 1726883065.28768: Calling all_inventory to load vars for managed_node2
33277 1726883065.28772: Calling groups_inventory to load vars for managed_node2
33277 1726883065.28775: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.28789: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.28792: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.28795: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.29389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.29883: done with get_vars()
33277 1726883065.29895: done getting variables
33277 1726883065.30072: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:44:25 -0400 (0:00:00.051) 0:00:08.489 ******
33277 1726883065.30106: entering _queue_task() for managed_node2/service
33277 1726883065.30681: worker is 1 (out of 1 available)
33277 1726883065.30810: exiting _queue_task() for managed_node2/service
33277 1726883065.30911: done queuing things up, now waiting for results queue to drain
33277 1726883065.30913: waiting for pending results...
33277 1726883065.31195: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
33277 1726883065.31650: in run() - task 0affc7ec-ae25-6628-6da4-00000000006e
33277 1726883065.31654: variable 'ansible_search_path' from source: unknown
33277 1726883065.31656: variable 'ansible_search_path' from source: unknown
33277 1726883065.31659: calling self._execute()
33277 1726883065.31809: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.31824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.31879: variable 'omit' from source: magic vars
33277 1726883065.32620: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.32757: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.32995: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.33048: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.33055: when evaluation is False, skipping this task
33277 1726883065.33111: _execute() done
33277 1726883065.33148: dumping result to json
33277 1726883065.33152: done dumping result, returning
33277 1726883065.33154: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-00000000006e]
33277 1726883065.33157: sending task result for task 0affc7ec-ae25-6628-6da4-00000000006e
33277 1726883065.33438: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000006e
33277 1726883065.33443: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.33492: no more pending results, returning what we have
33277 1726883065.33496: results queue empty
33277 1726883065.33497: checking for any_errors_fatal
33277 1726883065.33504: done checking for any_errors_fatal
33277 1726883065.33505: checking for max_fail_percentage
33277 1726883065.33506: done checking for max_fail_percentage
33277 1726883065.33507: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.33508: done checking to see if all hosts have failed
33277 1726883065.33508: getting the remaining hosts for this loop
33277 1726883065.33509: done getting the remaining hosts for this loop
33277 1726883065.33513: getting the next task for host managed_node2
33277 1726883065.33519: done getting next task for host managed_node2
33277 1726883065.33627: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
33277 1726883065.33631: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.33652: getting variables
33277 1726883065.33654: in VariableManager get_vars()
33277 1726883065.33701: Calling all_inventory to load vars for managed_node2
33277 1726883065.33704: Calling groups_inventory to load vars for managed_node2
33277 1726883065.33706: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.33715: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.33718: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.33721: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.34246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.34715: done with get_vars()
33277 1726883065.34727: done getting variables
33277 1726883065.34820: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:44:25 -0400 (0:00:00.047) 0:00:08.536 ******
33277 1726883065.34853: entering _queue_task() for managed_node2/service
33277 1726883065.35165: worker is 1 (out of 1 available)
33277 1726883065.35180: exiting _queue_task() for managed_node2/service
33277 1726883065.35195: done queuing things up, now waiting for results queue to drain
33277 1726883065.35197: waiting for pending results...
33277 1726883065.35541: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
33277 1726883065.35555: in run() - task 0affc7ec-ae25-6628-6da4-00000000006f
33277 1726883065.35566: variable 'ansible_search_path' from source: unknown
33277 1726883065.35574: variable 'ansible_search_path' from source: unknown
33277 1726883065.35614: calling self._execute()
33277 1726883065.35700: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.35712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.35727: variable 'omit' from source: magic vars
33277 1726883065.36202: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.36206: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.36326: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.36338: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.36420: when evaluation is False, skipping this task
33277 1726883065.36425: _execute() done
33277 1726883065.36428: dumping result to json
33277 1726883065.36432: done dumping result, returning
33277 1726883065.36437: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-6628-6da4-00000000006f]
33277 1726883065.36441: sending task result for task 0affc7ec-ae25-6628-6da4-00000000006f
33277 1726883065.36514: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000006f
33277 1726883065.36518: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
33277 1726883065.36560: no more pending results, returning what we have
33277 1726883065.36564: results queue empty
33277 1726883065.36565: checking for any_errors_fatal
33277 1726883065.36571: done checking for any_errors_fatal
33277 1726883065.36572: checking for max_fail_percentage
33277 1726883065.36573: done checking for max_fail_percentage
33277 1726883065.36574: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.36575: done checking to see if all hosts have failed
33277 1726883065.36576: getting the remaining hosts for this loop
33277 1726883065.36577: done getting the remaining hosts for this loop
33277 1726883065.36581: getting the next task for host managed_node2
33277 1726883065.36590: done getting next task for host managed_node2
33277 1726883065.36594: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
33277 1726883065.36597: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.36616: getting variables
33277 1726883065.36617: in VariableManager get_vars()
33277 1726883065.36666: Calling all_inventory to load vars for managed_node2
33277 1726883065.36669: Calling groups_inventory to load vars for managed_node2
33277 1726883065.36671: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.36680: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.36682: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.36687: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.37242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.37519: done with get_vars()
33277 1726883065.37533: done getting variables
33277 1726883065.37592: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 21:44:25 -0400 (0:00:00.027) 0:00:08.564 ******
33277 1726883065.37625: entering _queue_task() for managed_node2/service
33277 1726883065.37953: worker is 1 (out of 1 available)
33277 1726883065.37971: exiting _queue_task() for managed_node2/service
33277 1726883065.37982: done queuing things up, now waiting for results queue to drain
33277 1726883065.37984: waiting for pending results...
33277 1726883065.38287: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
33277 1726883065.38531: in run() - task 0affc7ec-ae25-6628-6da4-000000000070
33277 1726883065.38534: variable 'ansible_search_path' from source: unknown
33277 1726883065.38537: variable 'ansible_search_path' from source: unknown
33277 1726883065.38540: calling self._execute()
33277 1726883065.38542: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.38545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.38582: variable 'omit' from source: magic vars
33277 1726883065.39040: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.39074: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.39204: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.39208: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.39211: when evaluation is False, skipping this task
33277 1726883065.39213: _execute() done
33277 1726883065.39218: dumping result to json
33277 1726883065.39221: done dumping result, returning
33277 1726883065.39232: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-6628-6da4-000000000070]
33277 1726883065.39237: sending task result for task 0affc7ec-ae25-6628-6da4-000000000070
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.39391: no more pending results, returning what we have
33277 1726883065.39395: results queue empty
33277 1726883065.39396: checking for any_errors_fatal
33277 1726883065.39403: done checking for any_errors_fatal
33277 1726883065.39404: checking for max_fail_percentage
33277 1726883065.39405: done checking for max_fail_percentage
33277 1726883065.39406: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.39407: done checking to see if all hosts have failed
33277 1726883065.39408: getting the remaining hosts for this loop
33277 1726883065.39410: done getting the remaining hosts for this loop
33277 1726883065.39414: getting the next task for host managed_node2
33277 1726883065.39420: done getting next task for host managed_node2
33277 1726883065.39425: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
33277 1726883065.39428: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.39453: getting variables
33277 1726883065.39455: in VariableManager get_vars()
33277 1726883065.39616: Calling all_inventory to load vars for managed_node2
33277 1726883065.39619: Calling groups_inventory to load vars for managed_node2
33277 1726883065.39623: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.39633: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.39635: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.39639: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.39863: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000070
33277 1726883065.39868: WORKER PROCESS EXITING
33277 1726883065.39882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.40164: done with get_vars()
33277 1726883065.40174: done getting variables
33277 1726883065.40235: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 21:44:25 -0400 (0:00:00.026) 0:00:08.591 ******
33277 1726883065.40270: entering _queue_task() for managed_node2/service
33277 1726883065.40613: worker is 1 (out of 1 available)
33277 1726883065.40626: exiting _queue_task() for managed_node2/service
33277 1726883065.40634: done queuing things up, now waiting for results queue to drain
33277 1726883065.40636: waiting for pending results...
33277 1726883065.40826: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service
33277 1726883065.40970: in run() - task 0affc7ec-ae25-6628-6da4-000000000071
33277 1726883065.41030: variable 'ansible_search_path' from source: unknown
33277 1726883065.41034: variable 'ansible_search_path' from source: unknown
33277 1726883065.41041: calling self._execute()
33277 1726883065.41131: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.41149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.41164: variable 'omit' from source: magic vars
33277 1726883065.41690: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.41694: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.41810: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.41824: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.41832: when evaluation is False, skipping this task
33277 1726883065.41839: _execute() done
33277 1726883065.41846: dumping result to json
33277 1726883065.41853: done dumping result, returning
33277 1726883065.41864: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-6628-6da4-000000000071]
33277 1726883065.41874: sending task result for task 0affc7ec-ae25-6628-6da4-000000000071
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
33277 1726883065.42062: no more pending results, returning what we have
33277 1726883065.42066: results queue empty
33277 1726883065.42067: checking for any_errors_fatal
33277 1726883065.42077: done checking for any_errors_fatal
33277 1726883065.42078: checking for max_fail_percentage
33277 1726883065.42080: done checking for max_fail_percentage
33277 1726883065.42081: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.42081: done checking to see if all hosts have failed
33277 1726883065.42083: getting the remaining hosts for this loop
33277 1726883065.42084: done getting the remaining hosts for this loop
33277 1726883065.42089: getting the next task for host managed_node2
33277 1726883065.42095: done getting next task for host managed_node2
33277 1726883065.42099: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
33277 1726883065.42103: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.42239: getting variables
33277 1726883065.42241: in VariableManager get_vars()
33277 1726883065.42292: Calling all_inventory to load vars for managed_node2
33277 1726883065.42295: Calling groups_inventory to load vars for managed_node2
33277 1726883065.42298: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.42310: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.42313: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.42316: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.42434: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000071
33277 1726883065.42438: WORKER PROCESS EXITING
33277 1726883065.42636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.42972: done with get_vars()
33277 1726883065.42982: done getting variables
33277 1726883065.43045: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 21:44:25 -0400 (0:00:00.028) 0:00:08.619 ******
33277 1726883065.43081: entering _queue_task() for managed_node2/copy
33277 1726883065.43428: worker is 1 (out of 1 available)
33277 1726883065.43439: exiting _queue_task() for managed_node2/copy
33277 1726883065.43449: done queuing things up, now waiting for results queue to drain
33277 1726883065.43451: waiting for pending results...
33277 1726883065.43767: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
33277 1726883065.43909: in run() - task 0affc7ec-ae25-6628-6da4-000000000072
33277 1726883065.43934: variable 'ansible_search_path' from source: unknown
33277 1726883065.43944: variable 'ansible_search_path' from source: unknown
33277 1726883065.43993: calling self._execute()
33277 1726883065.44088: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.44101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.44129: variable 'omit' from source: magic vars
33277 1726883065.44534: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.44605: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.44686: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.44698: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.44705: when evaluation is False, skipping this task
33277 1726883065.44722: _execute() done
33277 1726883065.44734: dumping result to json
33277 1726883065.44743: done dumping result, returning
33277 1726883065.44756: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-6628-6da4-000000000072]
33277 1726883065.44767: sending task result for task 0affc7ec-ae25-6628-6da4-000000000072
33277 1726883065.44897: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000072
33277 1726883065.44900: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.44971: no more pending results, returning what we have
33277 1726883065.44975: results queue empty
33277 1726883065.44977: checking for any_errors_fatal
33277 1726883065.44983: done checking for any_errors_fatal
33277 1726883065.44984: checking for max_fail_percentage
33277 1726883065.44985: done checking for max_fail_percentage
33277 1726883065.44986: checking to see if all hosts have failed and the running result is not ok
33277 1726883065.44987: done checking to see if all hosts have failed
33277 1726883065.44988: getting the remaining hosts for this loop
33277 1726883065.44990: done getting the remaining hosts for this loop
33277 1726883065.44994: getting the next task for host managed_node2
33277 1726883065.45001: done getting next task for host managed_node2
33277 1726883065.45005: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
33277 1726883065.45008: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
33277 1726883065.45030: getting variables
33277 1726883065.45032: in VariableManager get_vars()
33277 1726883065.45080: Calling all_inventory to load vars for managed_node2
33277 1726883065.45083: Calling groups_inventory to load vars for managed_node2
33277 1726883065.45085: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883065.45099: Calling all_plugins_play to load vars for managed_node2
33277 1726883065.45102: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883065.45106: Calling groups_plugins_play to load vars for managed_node2
33277 1726883065.45540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883065.45797: done with get_vars()
33277 1726883065.45806: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 21:44:25 -0400 (0:00:00.028) 0:00:08.647 ******
33277 1726883065.45890: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections
33277 1726883065.46192: worker is 1 (out of 1 available)
33277 1726883065.46204: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections
33277 1726883065.46215: done queuing things up, now waiting for results queue to drain
33277 1726883065.46217: waiting for pending results...
33277 1726883065.46574: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
33277 1726883065.46662: in run() - task 0affc7ec-ae25-6628-6da4-000000000073
33277 1726883065.46666: variable 'ansible_search_path' from source: unknown
33277 1726883065.46670: variable 'ansible_search_path' from source: unknown
33277 1726883065.46724: calling self._execute()
33277 1726883065.46845: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883065.46907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883065.46910: variable 'omit' from source: magic vars
33277 1726883065.47430: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.47447: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883065.47581: variable 'ansible_distribution_major_version' from source: facts
33277 1726883065.47602: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883065.47610: when evaluation is False, skipping this task
33277 1726883065.47618: _execute() done
33277 1726883065.47701: dumping result to json
33277 1726883065.47704: done dumping result, returning
33277 1726883065.47718: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-6628-6da4-000000000073]
33277 1726883065.47720: sending task result for task 0affc7ec-ae25-6628-6da4-000000000073
33277 1726883065.47796: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000073
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883065.47971: no more pending results, returning what we have
33277 1726883065.47974: results queue empty
33277 1726883065.47975: checking for any_errors_fatal
33277 1726883065.47981: done
checking for any_errors_fatal 33277 1726883065.47982: checking for max_fail_percentage 33277 1726883065.47984: done checking for max_fail_percentage 33277 1726883065.47984: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.47988: done checking to see if all hosts have failed 33277 1726883065.47989: getting the remaining hosts for this loop 33277 1726883065.47990: done getting the remaining hosts for this loop 33277 1726883065.47994: getting the next task for host managed_node2 33277 1726883065.48000: done getting next task for host managed_node2 33277 1726883065.48004: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33277 1726883065.48006: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.48025: getting variables 33277 1726883065.48027: in VariableManager get_vars() 33277 1726883065.48076: Calling all_inventory to load vars for managed_node2 33277 1726883065.48080: Calling groups_inventory to load vars for managed_node2 33277 1726883065.48082: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.48096: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.48100: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.48103: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.48404: WORKER PROCESS EXITING 33277 1726883065.48437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.48946: done with get_vars() 33277 1726883065.48955: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:25 -0400 (0:00:00.032) 0:00:08.680 ****** 33277 1726883065.49189: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 33277 1726883065.49770: worker is 1 (out of 1 available) 33277 1726883065.49782: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 33277 1726883065.49793: done queuing things up, now waiting for results queue to drain 33277 1726883065.49795: waiting for pending results... 
33277 1726883065.50001: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 33277 1726883065.50207: in run() - task 0affc7ec-ae25-6628-6da4-000000000074 33277 1726883065.50211: variable 'ansible_search_path' from source: unknown 33277 1726883065.50214: variable 'ansible_search_path' from source: unknown 33277 1726883065.50217: calling self._execute() 33277 1726883065.50277: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.50291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.50304: variable 'omit' from source: magic vars 33277 1726883065.50864: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.50867: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.50938: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.50949: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.50955: when evaluation is False, skipping this task 33277 1726883065.50964: _execute() done 33277 1726883065.51127: dumping result to json 33277 1726883065.51131: done dumping result, returning 33277 1726883065.51134: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-6628-6da4-000000000074] 33277 1726883065.51136: sending task result for task 0affc7ec-ae25-6628-6da4-000000000074 33277 1726883065.51210: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000074 33277 1726883065.51213: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.51268: no more pending results, returning what we have 33277 1726883065.51272: results queue empty 33277 1726883065.51274: checking for any_errors_fatal 33277 
1726883065.51282: done checking for any_errors_fatal 33277 1726883065.51283: checking for max_fail_percentage 33277 1726883065.51285: done checking for max_fail_percentage 33277 1726883065.51288: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.51289: done checking to see if all hosts have failed 33277 1726883065.51290: getting the remaining hosts for this loop 33277 1726883065.51292: done getting the remaining hosts for this loop 33277 1726883065.51297: getting the next task for host managed_node2 33277 1726883065.51303: done getting next task for host managed_node2 33277 1726883065.51308: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33277 1726883065.51311: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.51335: getting variables 33277 1726883065.51337: in VariableManager get_vars() 33277 1726883065.51391: Calling all_inventory to load vars for managed_node2 33277 1726883065.51394: Calling groups_inventory to load vars for managed_node2 33277 1726883065.51397: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.51410: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.51413: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.51416: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.51803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.52064: done with get_vars() 33277 1726883065.52078: done getting variables 33277 1726883065.52141: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:25 -0400 (0:00:00.029) 0:00:08.710 ****** 33277 1726883065.52171: entering _queue_task() for managed_node2/debug 33277 1726883065.52534: worker is 1 (out of 1 available) 33277 1726883065.52544: exiting _queue_task() for managed_node2/debug 33277 1726883065.52553: done queuing things up, now waiting for results queue to drain 33277 1726883065.52555: waiting for pending results... 
33277 1726883065.52839: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33277 1726883065.52882: in run() - task 0affc7ec-ae25-6628-6da4-000000000075 33277 1726883065.52930: variable 'ansible_search_path' from source: unknown 33277 1726883065.52956: variable 'ansible_search_path' from source: unknown 33277 1726883065.52982: calling self._execute() 33277 1726883065.53078: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.53108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.53112: variable 'omit' from source: magic vars 33277 1726883065.53531: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.53535: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.53721: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.53726: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.53729: when evaluation is False, skipping this task 33277 1726883065.53731: _execute() done 33277 1726883065.53733: dumping result to json 33277 1726883065.53735: done dumping result, returning 33277 1726883065.53738: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-6628-6da4-000000000075] 33277 1726883065.53740: sending task result for task 0affc7ec-ae25-6628-6da4-000000000075 skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883065.53868: no more pending results, returning what we have 33277 1726883065.53872: results queue empty 33277 1726883065.53873: checking for any_errors_fatal 33277 1726883065.53878: done checking for any_errors_fatal 33277 1726883065.53879: checking for max_fail_percentage 33277 1726883065.53880: done checking for 
max_fail_percentage 33277 1726883065.53881: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.53882: done checking to see if all hosts have failed 33277 1726883065.53883: getting the remaining hosts for this loop 33277 1726883065.53884: done getting the remaining hosts for this loop 33277 1726883065.53891: getting the next task for host managed_node2 33277 1726883065.53898: done getting next task for host managed_node2 33277 1726883065.53902: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33277 1726883065.53906: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.53927: getting variables 33277 1726883065.53929: in VariableManager get_vars() 33277 1726883065.54096: Calling all_inventory to load vars for managed_node2 33277 1726883065.54099: Calling groups_inventory to load vars for managed_node2 33277 1726883065.54102: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.54112: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.54115: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.54118: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.54230: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000075 33277 1726883065.54233: WORKER PROCESS EXITING 33277 1726883065.54444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.54927: done with get_vars() 33277 1726883065.54937: done getting variables 33277 1726883065.55000: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:25 -0400 (0:00:00.028) 0:00:08.739 ****** 33277 1726883065.55071: entering _queue_task() for managed_node2/debug 33277 1726883065.55430: worker is 1 (out of 1 available) 33277 1726883065.55444: exiting _queue_task() for managed_node2/debug 33277 1726883065.55456: done queuing things up, now waiting for results queue to drain 33277 1726883065.55458: waiting for pending results... 
33277 1726883065.55920: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33277 1726883065.56231: in run() - task 0affc7ec-ae25-6628-6da4-000000000076 33277 1726883065.56248: variable 'ansible_search_path' from source: unknown 33277 1726883065.56253: variable 'ansible_search_path' from source: unknown 33277 1726883065.56347: calling self._execute() 33277 1726883065.56545: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.56556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.56560: variable 'omit' from source: magic vars 33277 1726883065.58026: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.58030: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.58430: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.58435: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.58438: when evaluation is False, skipping this task 33277 1726883065.58441: _execute() done 33277 1726883065.58444: dumping result to json 33277 1726883065.58446: done dumping result, returning 33277 1726883065.58629: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-6628-6da4-000000000076] 33277 1726883065.58633: sending task result for task 0affc7ec-ae25-6628-6da4-000000000076 skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883065.58873: no more pending results, returning what we have 33277 1726883065.58877: results queue empty 33277 1726883065.58879: checking for any_errors_fatal 33277 1726883065.58891: done checking for any_errors_fatal 33277 1726883065.58892: checking for max_fail_percentage 33277 1726883065.58893: done checking for 
max_fail_percentage 33277 1726883065.58894: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.58895: done checking to see if all hosts have failed 33277 1726883065.58896: getting the remaining hosts for this loop 33277 1726883065.58899: done getting the remaining hosts for this loop 33277 1726883065.58903: getting the next task for host managed_node2 33277 1726883065.58910: done getting next task for host managed_node2 33277 1726883065.58915: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33277 1726883065.58918: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.58940: getting variables 33277 1726883065.58942: in VariableManager get_vars() 33277 1726883065.59001: Calling all_inventory to load vars for managed_node2 33277 1726883065.59004: Calling groups_inventory to load vars for managed_node2 33277 1726883065.59007: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.59020: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.59343: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.59348: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.60096: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000076 33277 1726883065.60100: WORKER PROCESS EXITING 33277 1726883065.60339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.60797: done with get_vars() 33277 1726883065.60807: done getting variables 33277 1726883065.60898: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:25 -0400 (0:00:00.058) 0:00:08.797 ****** 33277 1726883065.60932: entering _queue_task() for managed_node2/debug 33277 1726883065.61356: worker is 1 (out of 1 available) 33277 1726883065.61367: exiting _queue_task() for managed_node2/debug 33277 1726883065.61377: done queuing things up, now waiting for results queue to drain 33277 1726883065.61379: waiting for pending results... 
33277 1726883065.61559: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33277 1726883065.61717: in run() - task 0affc7ec-ae25-6628-6da4-000000000077 33277 1726883065.61745: variable 'ansible_search_path' from source: unknown 33277 1726883065.61753: variable 'ansible_search_path' from source: unknown 33277 1726883065.61800: calling self._execute() 33277 1726883065.61905: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.61929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.62029: variable 'omit' from source: magic vars 33277 1726883065.62384: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.62404: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.62549: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.62560: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.62567: when evaluation is False, skipping this task 33277 1726883065.62573: _execute() done 33277 1726883065.62583: dumping result to json 33277 1726883065.62596: done dumping result, returning 33277 1726883065.62608: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-6628-6da4-000000000077] 33277 1726883065.62617: sending task result for task 0affc7ec-ae25-6628-6da4-000000000077 33277 1726883065.62766: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000077 33277 1726883065.62770: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883065.62852: no more pending results, returning what we have 33277 1726883065.62856: results queue empty 33277 1726883065.62857: checking for any_errors_fatal 33277 1726883065.62867: done checking for 
any_errors_fatal 33277 1726883065.62867: checking for max_fail_percentage 33277 1726883065.62869: done checking for max_fail_percentage 33277 1726883065.62870: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.62871: done checking to see if all hosts have failed 33277 1726883065.62872: getting the remaining hosts for this loop 33277 1726883065.62873: done getting the remaining hosts for this loop 33277 1726883065.62878: getting the next task for host managed_node2 33277 1726883065.62884: done getting next task for host managed_node2 33277 1726883065.62892: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33277 1726883065.62895: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.63036: getting variables 33277 1726883065.63039: in VariableManager get_vars() 33277 1726883065.63245: Calling all_inventory to load vars for managed_node2 33277 1726883065.63248: Calling groups_inventory to load vars for managed_node2 33277 1726883065.63250: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.63259: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.63261: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.63264: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.63615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.63977: done with get_vars() 33277 1726883065.64003: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:25 -0400 (0:00:00.031) 0:00:08.829 ****** 33277 1726883065.64128: entering _queue_task() for managed_node2/ping 33277 1726883065.64480: worker is 1 (out of 1 available) 33277 1726883065.64497: exiting _queue_task() for managed_node2/ping 33277 1726883065.64508: done queuing things up, now waiting for results queue to drain 33277 1726883065.64510: waiting for pending results... 
33277 1726883065.64892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 33277 1726883065.64917: in run() - task 0affc7ec-ae25-6628-6da4-000000000078 33277 1726883065.64940: variable 'ansible_search_path' from source: unknown 33277 1726883065.64948: variable 'ansible_search_path' from source: unknown 33277 1726883065.65013: calling self._execute() 33277 1726883065.65124: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.65137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.65152: variable 'omit' from source: magic vars 33277 1726883065.65651: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.65751: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.65821: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.65835: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.65843: when evaluation is False, skipping this task 33277 1726883065.65855: _execute() done 33277 1726883065.65866: dumping result to json 33277 1726883065.65874: done dumping result, returning 33277 1726883065.65898: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-6628-6da4-000000000078] 33277 1726883065.65909: sending task result for task 0affc7ec-ae25-6628-6da4-000000000078 33277 1726883065.66059: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000078 33277 1726883065.66062: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.66124: no more pending results, returning what we have 33277 1726883065.66128: results queue empty 33277 1726883065.66130: checking for any_errors_fatal 33277 
1726883065.66138: done checking for any_errors_fatal 33277 1726883065.66139: checking for max_fail_percentage 33277 1726883065.66141: done checking for max_fail_percentage 33277 1726883065.66142: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.66143: done checking to see if all hosts have failed 33277 1726883065.66144: getting the remaining hosts for this loop 33277 1726883065.66145: done getting the remaining hosts for this loop 33277 1726883065.66150: getting the next task for host managed_node2 33277 1726883065.66159: done getting next task for host managed_node2 33277 1726883065.66161: ^ task is: TASK: meta (role_complete) 33277 1726883065.66164: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.66396: getting variables 33277 1726883065.66398: in VariableManager get_vars() 33277 1726883065.66445: Calling all_inventory to load vars for managed_node2 33277 1726883065.66448: Calling groups_inventory to load vars for managed_node2 33277 1726883065.66451: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.66459: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.66462: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.66466: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.66779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.67054: done with get_vars() 33277 1726883065.67064: done getting variables 33277 1726883065.67155: done queuing things up, now waiting for results queue to drain 33277 1726883065.67157: results queue empty 33277 1726883065.67157: checking for any_errors_fatal 33277 1726883065.67159: done checking for any_errors_fatal 33277 1726883065.67160: checking for max_fail_percentage 33277 1726883065.67161: done checking for max_fail_percentage 33277 1726883065.67162: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.67163: done checking to see if all hosts have failed 33277 1726883065.67163: getting the remaining hosts for this loop 33277 1726883065.67164: done getting the remaining hosts for this loop 33277 1726883065.67166: getting the next task for host managed_node2 33277 1726883065.67171: done getting next task for host managed_node2 33277 1726883065.67172: ^ task is: TASK: TEST: wireless connection with 802.1x TLS-EAP 33277 1726883065.67174: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 33277 1726883065.67176: getting variables 33277 1726883065.67177: in VariableManager get_vars() 33277 1726883065.67197: Calling all_inventory to load vars for managed_node2 33277 1726883065.67211: Calling groups_inventory to load vars for managed_node2 33277 1726883065.67214: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.67219: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.67223: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.67227: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.67454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.67704: done with get_vars() 33277 1726883065.67713: done getting variables 33277 1726883065.67767: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with 802.1x TLS-EAP] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53 Friday 20 September 2024 21:44:25 -0400 (0:00:00.036) 0:00:08.866 ****** 33277 1726883065.67800: entering _queue_task() for managed_node2/debug 33277 1726883065.68152: worker is 1 (out of 1 available) 33277 1726883065.68164: exiting _queue_task() for managed_node2/debug 33277 1726883065.68175: done queuing things up, now waiting for results queue to drain 33277 1726883065.68177: waiting for pending results... 
33277 1726883065.68462: running TaskExecutor() for managed_node2/TASK: TEST: wireless connection with 802.1x TLS-EAP 33277 1726883065.68560: in run() - task 0affc7ec-ae25-6628-6da4-0000000000a8 33277 1726883065.68565: variable 'ansible_search_path' from source: unknown 33277 1726883065.68595: calling self._execute() 33277 1726883065.68728: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.68733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.68736: variable 'omit' from source: magic vars 33277 1726883065.69182: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.69224: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.69351: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.69363: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.69382: when evaluation is False, skipping this task 33277 1726883065.69431: _execute() done 33277 1726883065.69435: dumping result to json 33277 1726883065.69438: done dumping result, returning 33277 1726883065.69440: done running TaskExecutor() for managed_node2/TASK: TEST: wireless connection with 802.1x TLS-EAP [0affc7ec-ae25-6628-6da4-0000000000a8] 33277 1726883065.69443: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000a8 skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883065.69677: no more pending results, returning what we have 33277 1726883065.69681: results queue empty 33277 1726883065.69683: checking for any_errors_fatal 33277 1726883065.69684: done checking for any_errors_fatal 33277 1726883065.69688: checking for max_fail_percentage 33277 1726883065.69689: done checking for max_fail_percentage 33277 1726883065.69690: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.69691: done checking to see if 
all hosts have failed 33277 1726883065.69692: getting the remaining hosts for this loop 33277 1726883065.69693: done getting the remaining hosts for this loop 33277 1726883065.69697: getting the next task for host managed_node2 33277 1726883065.69705: done getting next task for host managed_node2 33277 1726883065.69712: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33277 1726883065.69715: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.69740: getting variables 33277 1726883065.69742: in VariableManager get_vars() 33277 1726883065.69794: Calling all_inventory to load vars for managed_node2 33277 1726883065.69797: Calling groups_inventory to load vars for managed_node2 33277 1726883065.69799: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.69812: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.69815: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.69819: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.69944: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000a8 33277 1726883065.69948: WORKER PROCESS EXITING 33277 1726883065.70144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.70412: done with get_vars() 33277 1726883065.70423: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:25 -0400 (0:00:00.028) 0:00:08.894 ****** 33277 1726883065.70644: entering _queue_task() for managed_node2/include_tasks 33277 1726883065.70901: worker is 1 (out of 1 available) 33277 1726883065.70917: exiting _queue_task() for managed_node2/include_tasks 33277 1726883065.70934: done queuing things up, now waiting for results queue to drain 33277 1726883065.70935: waiting for pending results... 
33277 1726883065.71174: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33277 1726883065.71279: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b0 33277 1726883065.71292: variable 'ansible_search_path' from source: unknown 33277 1726883065.71296: variable 'ansible_search_path' from source: unknown 33277 1726883065.71326: calling self._execute() 33277 1726883065.71390: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.71394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.71402: variable 'omit' from source: magic vars 33277 1726883065.71675: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.71690: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.71769: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.71773: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.71776: when evaluation is False, skipping this task 33277 1726883065.71779: _execute() done 33277 1726883065.71781: dumping result to json 33277 1726883065.71783: done dumping result, returning 33277 1726883065.71795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-6628-6da4-0000000000b0] 33277 1726883065.71798: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b0 33277 1726883065.71899: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b0 33277 1726883065.71902: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.71956: no more pending results, returning what we have 33277 1726883065.71959: results queue empty 33277 1726883065.71960: checking for 
any_errors_fatal 33277 1726883065.71966: done checking for any_errors_fatal 33277 1726883065.71967: checking for max_fail_percentage 33277 1726883065.71968: done checking for max_fail_percentage 33277 1726883065.71968: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.71969: done checking to see if all hosts have failed 33277 1726883065.71970: getting the remaining hosts for this loop 33277 1726883065.71971: done getting the remaining hosts for this loop 33277 1726883065.71974: getting the next task for host managed_node2 33277 1726883065.71978: done getting next task for host managed_node2 33277 1726883065.71982: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33277 1726883065.71984: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.72006: getting variables 33277 1726883065.72008: in VariableManager get_vars() 33277 1726883065.72046: Calling all_inventory to load vars for managed_node2 33277 1726883065.72048: Calling groups_inventory to load vars for managed_node2 33277 1726883065.72049: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.72056: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.72057: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.72059: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.72206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.72345: done with get_vars() 33277 1726883065.72352: done getting variables 33277 1726883065.72398: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:25 -0400 (0:00:00.017) 0:00:08.912 ****** 33277 1726883065.72419: entering _queue_task() for managed_node2/debug 33277 1726883065.72603: worker is 1 (out of 1 available) 33277 1726883065.72616: exiting _queue_task() for managed_node2/debug 33277 1726883065.72627: done queuing things up, now waiting for results queue to drain 33277 1726883065.72629: waiting for pending results... 
33277 1726883065.72806: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 33277 1726883065.72889: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b1 33277 1726883065.72904: variable 'ansible_search_path' from source: unknown 33277 1726883065.72907: variable 'ansible_search_path' from source: unknown 33277 1726883065.72939: calling self._execute() 33277 1726883065.73008: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.73012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.73020: variable 'omit' from source: magic vars 33277 1726883065.73298: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.73308: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.73386: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.73395: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.73398: when evaluation is False, skipping this task 33277 1726883065.73401: _execute() done 33277 1726883065.73404: dumping result to json 33277 1726883065.73406: done dumping result, returning 33277 1726883065.73418: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-6628-6da4-0000000000b1] 33277 1726883065.73421: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b1 33277 1726883065.73503: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b1 33277 1726883065.73507: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883065.73563: no more pending results, returning what we have 33277 1726883065.73571: results queue empty 33277 1726883065.73572: checking for any_errors_fatal 33277 1726883065.73576: done checking for any_errors_fatal 33277 1726883065.73577: 
checking for max_fail_percentage 33277 1726883065.73579: done checking for max_fail_percentage 33277 1726883065.73580: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.73581: done checking to see if all hosts have failed 33277 1726883065.73581: getting the remaining hosts for this loop 33277 1726883065.73582: done getting the remaining hosts for this loop 33277 1726883065.73585: getting the next task for host managed_node2 33277 1726883065.73590: done getting next task for host managed_node2 33277 1726883065.73593: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33277 1726883065.73596: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.73613: getting variables 33277 1726883065.73614: in VariableManager get_vars() 33277 1726883065.73656: Calling all_inventory to load vars for managed_node2 33277 1726883065.73659: Calling groups_inventory to load vars for managed_node2 33277 1726883065.73661: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.73673: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.73676: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.73680: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.73871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.74119: done with get_vars() 33277 1726883065.74132: done getting variables 33277 1726883065.74200: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:25 -0400 (0:00:00.018) 0:00:08.930 ****** 33277 1726883065.74235: entering _queue_task() for managed_node2/fail 33277 1726883065.74715: worker is 1 (out of 1 available) 33277 1726883065.74731: exiting _queue_task() for managed_node2/fail 33277 1726883065.74745: done queuing things up, now waiting for results queue to drain 33277 1726883065.74747: waiting for pending results... 
33277 1726883065.75096: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33277 1726883065.75328: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b2 33277 1726883065.75332: variable 'ansible_search_path' from source: unknown 33277 1726883065.75335: variable 'ansible_search_path' from source: unknown 33277 1726883065.75338: calling self._execute() 33277 1726883065.75384: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.75393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.75401: variable 'omit' from source: magic vars 33277 1726883065.75785: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.75804: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.75917: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.75924: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.75927: when evaluation is False, skipping this task 33277 1726883065.75930: _execute() done 33277 1726883065.75933: dumping result to json 33277 1726883065.75935: done dumping result, returning 33277 1726883065.75944: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-6628-6da4-0000000000b2] 33277 1726883065.75948: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b2 33277 1726883065.76132: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b2 33277 1726883065.76135: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
33277 1726883065.76168: no more pending results, returning what we have 33277 1726883065.76171: results queue empty 33277 1726883065.76172: checking for any_errors_fatal 33277 1726883065.76176: done checking for any_errors_fatal 33277 1726883065.76177: checking for max_fail_percentage 33277 1726883065.76179: done checking for max_fail_percentage 33277 1726883065.76179: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.76180: done checking to see if all hosts have failed 33277 1726883065.76181: getting the remaining hosts for this loop 33277 1726883065.76182: done getting the remaining hosts for this loop 33277 1726883065.76185: getting the next task for host managed_node2 33277 1726883065.76190: done getting next task for host managed_node2 33277 1726883065.76194: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33277 1726883065.76197: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.76212: getting variables 33277 1726883065.76213: in VariableManager get_vars() 33277 1726883065.76263: Calling all_inventory to load vars for managed_node2 33277 1726883065.76266: Calling groups_inventory to load vars for managed_node2 33277 1726883065.76269: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.76277: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.76280: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.76283: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.76530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.76796: done with get_vars() 33277 1726883065.76806: done getting variables 33277 1726883065.76875: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:25 -0400 (0:00:00.026) 0:00:08.957 ****** 33277 1726883065.76915: entering _queue_task() for managed_node2/fail 33277 1726883065.77180: worker is 1 (out of 1 available) 33277 1726883065.77194: exiting _queue_task() for managed_node2/fail 33277 1726883065.77206: done queuing things up, now waiting for results queue to drain 33277 1726883065.77208: waiting for pending results... 
33277 1726883065.77525: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33277 1726883065.77769: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b3 33277 1726883065.77775: variable 'ansible_search_path' from source: unknown 33277 1726883065.77777: variable 'ansible_search_path' from source: unknown 33277 1726883065.77780: calling self._execute() 33277 1726883065.77855: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.77875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.77899: variable 'omit' from source: magic vars 33277 1726883065.78330: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.78348: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.78480: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.78494: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.78501: when evaluation is False, skipping this task 33277 1726883065.78507: _execute() done 33277 1726883065.78532: dumping result to json 33277 1726883065.78537: done dumping result, returning 33277 1726883065.78545: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-6628-6da4-0000000000b3] 33277 1726883065.78627: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b3 33277 1726883065.78703: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b3 33277 1726883065.78706: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.78760: no more 
pending results, returning what we have 33277 1726883065.78763: results queue empty 33277 1726883065.78764: checking for any_errors_fatal 33277 1726883065.78770: done checking for any_errors_fatal 33277 1726883065.78771: checking for max_fail_percentage 33277 1726883065.78773: done checking for max_fail_percentage 33277 1726883065.78774: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.78774: done checking to see if all hosts have failed 33277 1726883065.78775: getting the remaining hosts for this loop 33277 1726883065.78776: done getting the remaining hosts for this loop 33277 1726883065.78780: getting the next task for host managed_node2 33277 1726883065.78789: done getting next task for host managed_node2 33277 1726883065.78793: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33277 1726883065.78796: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.78991: getting variables 33277 1726883065.78993: in VariableManager get_vars() 33277 1726883065.79036: Calling all_inventory to load vars for managed_node2 33277 1726883065.79039: Calling groups_inventory to load vars for managed_node2 33277 1726883065.79041: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.79050: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.79052: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.79055: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.79272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.79548: done with get_vars() 33277 1726883065.79558: done getting variables 33277 1726883065.79617: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:25 -0400 (0:00:00.027) 0:00:08.984 ****** 33277 1726883065.79658: entering _queue_task() for managed_node2/fail 33277 1726883065.80053: worker is 1 (out of 1 available) 33277 1726883065.80063: exiting _queue_task() for managed_node2/fail 33277 1726883065.80073: done queuing things up, now waiting for results queue to drain 33277 1726883065.80074: waiting for pending results... 
33277 1726883065.80346: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33277 1726883065.80351: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b4 33277 1726883065.80428: variable 'ansible_search_path' from source: unknown 33277 1726883065.80431: variable 'ansible_search_path' from source: unknown 33277 1726883065.80444: calling self._execute() 33277 1726883065.80499: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.80509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.80519: variable 'omit' from source: magic vars 33277 1726883065.81054: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.81073: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.81235: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.81246: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.81254: when evaluation is False, skipping this task 33277 1726883065.81260: _execute() done 33277 1726883065.81267: dumping result to json 33277 1726883065.81273: done dumping result, returning 33277 1726883065.81284: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-6628-6da4-0000000000b4] 33277 1726883065.81298: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b4 33277 1726883065.81497: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b4 33277 1726883065.81501: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.81571: no more pending 
results, returning what we have 33277 1726883065.81574: results queue empty 33277 1726883065.81576: checking for any_errors_fatal 33277 1726883065.81582: done checking for any_errors_fatal 33277 1726883065.81583: checking for max_fail_percentage 33277 1726883065.81588: done checking for max_fail_percentage 33277 1726883065.81589: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.81589: done checking to see if all hosts have failed 33277 1726883065.81590: getting the remaining hosts for this loop 33277 1726883065.81592: done getting the remaining hosts for this loop 33277 1726883065.81596: getting the next task for host managed_node2 33277 1726883065.81602: done getting next task for host managed_node2 33277 1726883065.81606: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33277 1726883065.81620: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.81657: getting variables 33277 1726883065.81660: in VariableManager get_vars() 33277 1726883065.81759: Calling all_inventory to load vars for managed_node2 33277 1726883065.81763: Calling groups_inventory to load vars for managed_node2 33277 1726883065.81765: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.81777: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.81779: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.81782: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.82148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.82420: done with get_vars() 33277 1726883065.82432: done getting variables 33277 1726883065.82502: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:25 -0400 (0:00:00.028) 0:00:09.013 ****** 33277 1726883065.82538: entering _queue_task() for managed_node2/dnf 33277 1726883065.82796: worker is 1 (out of 1 available) 33277 1726883065.82808: exiting _queue_task() for managed_node2/dnf 33277 1726883065.82933: done queuing things up, now waiting for results queue to drain 33277 1726883065.82935: waiting for pending results... 
33277 1726883065.83399: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33277 1726883065.83428: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b5 33277 1726883065.83448: variable 'ansible_search_path' from source: unknown 33277 1726883065.83455: variable 'ansible_search_path' from source: unknown 33277 1726883065.83503: calling self._execute() 33277 1726883065.83596: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.83615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.83632: variable 'omit' from source: magic vars 33277 1726883065.84037: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.84059: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.84227: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.84230: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.84232: when evaluation is False, skipping this task 33277 1726883065.84235: _execute() done 33277 1726883065.84237: dumping result to json 33277 1726883065.84239: done dumping result, returning 33277 1726883065.84249: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-0000000000b5] 33277 1726883065.84252: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b5 33277 1726883065.84534: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b5 33277 1726883065.84537: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 33277 1726883065.84577: no more pending results, returning what we have 33277 1726883065.84580: results queue empty 33277 1726883065.84581: checking for any_errors_fatal 33277 1726883065.84585: done checking for any_errors_fatal 33277 1726883065.84588: checking for max_fail_percentage 33277 1726883065.84590: done checking for max_fail_percentage 33277 1726883065.84591: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.84592: done checking to see if all hosts have failed 33277 1726883065.84593: getting the remaining hosts for this loop 33277 1726883065.84594: done getting the remaining hosts for this loop 33277 1726883065.84597: getting the next task for host managed_node2 33277 1726883065.84602: done getting next task for host managed_node2 33277 1726883065.84605: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33277 1726883065.84608: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.84627: getting variables 33277 1726883065.84628: in VariableManager get_vars() 33277 1726883065.84679: Calling all_inventory to load vars for managed_node2 33277 1726883065.84681: Calling groups_inventory to load vars for managed_node2 33277 1726883065.84683: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.84694: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.84696: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.84699: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.84943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.85250: done with get_vars() 33277 1726883065.85260: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33277 1726883065.85354: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:25 -0400 (0:00:00.028) 0:00:09.042 ****** 33277 1726883065.85383: entering _queue_task() for managed_node2/yum 33277 1726883065.85748: worker is 1 (out of 1 available) 33277 1726883065.85761: exiting _queue_task() for managed_node2/yum 33277 1726883065.85771: done queuing things up, now waiting for results queue to drain 33277 1726883065.85773: waiting for pending results... 
33277 1726883065.86013: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33277 1726883065.86120: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b6 33277 1726883065.86144: variable 'ansible_search_path' from source: unknown 33277 1726883065.86152: variable 'ansible_search_path' from source: unknown 33277 1726883065.86204: calling self._execute() 33277 1726883065.86328: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.86335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.86338: variable 'omit' from source: magic vars 33277 1726883065.86766: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.86831: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.86925: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.86942: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.86949: when evaluation is False, skipping this task 33277 1726883065.86955: _execute() done 33277 1726883065.86961: dumping result to json 33277 1726883065.86967: done dumping result, returning 33277 1726883065.86981: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-0000000000b6] 33277 1726883065.86995: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b6 33277 1726883065.87226: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b6 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.87279: no more pending 
results, returning what we have 33277 1726883065.87283: results queue empty 33277 1726883065.87284: checking for any_errors_fatal 33277 1726883065.87294: done checking for any_errors_fatal 33277 1726883065.87296: checking for max_fail_percentage 33277 1726883065.87297: done checking for max_fail_percentage 33277 1726883065.87298: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.87299: done checking to see if all hosts have failed 33277 1726883065.87300: getting the remaining hosts for this loop 33277 1726883065.87301: done getting the remaining hosts for this loop 33277 1726883065.87410: getting the next task for host managed_node2 33277 1726883065.87422: done getting next task for host managed_node2 33277 1726883065.87427: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33277 1726883065.87430: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.87440: WORKER PROCESS EXITING 33277 1726883065.87452: getting variables 33277 1726883065.87453: in VariableManager get_vars() 33277 1726883065.87497: Calling all_inventory to load vars for managed_node2 33277 1726883065.87500: Calling groups_inventory to load vars for managed_node2 33277 1726883065.87502: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.87511: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.87514: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.87517: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.87733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.88007: done with get_vars() 33277 1726883065.88016: done getting variables 33277 1726883065.88088: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:25 -0400 (0:00:00.027) 0:00:09.069 ****** 33277 1726883065.88121: entering _queue_task() for managed_node2/fail 33277 1726883065.88393: worker is 1 (out of 1 available) 33277 1726883065.88520: exiting _queue_task() for managed_node2/fail 33277 1726883065.88532: done queuing things up, now waiting for results queue to drain 33277 1726883065.88533: waiting for pending results... 
33277 1726883065.88745: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33277 1726883065.88881: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b7 33277 1726883065.88905: variable 'ansible_search_path' from source: unknown 33277 1726883065.88913: variable 'ansible_search_path' from source: unknown 33277 1726883065.88964: calling self._execute() 33277 1726883065.89064: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.89191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.89196: variable 'omit' from source: magic vars 33277 1726883065.89507: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.89535: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.89667: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.89678: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.89684: when evaluation is False, skipping this task 33277 1726883065.89694: _execute() done 33277 1726883065.89701: dumping result to json 33277 1726883065.89708: done dumping result, returning 33277 1726883065.89718: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-0000000000b7] 33277 1726883065.89730: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b7 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.89977: no more pending results, returning what we have 33277 1726883065.89981: results queue empty 33277 1726883065.89983: checking for any_errors_fatal 33277 1726883065.89994: done checking for 
any_errors_fatal 33277 1726883065.89995: checking for max_fail_percentage 33277 1726883065.89996: done checking for max_fail_percentage 33277 1726883065.89997: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.89998: done checking to see if all hosts have failed 33277 1726883065.89999: getting the remaining hosts for this loop 33277 1726883065.90000: done getting the remaining hosts for this loop 33277 1726883065.90004: getting the next task for host managed_node2 33277 1726883065.90011: done getting next task for host managed_node2 33277 1726883065.90014: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 33277 1726883065.90018: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.90043: getting variables 33277 1726883065.90045: in VariableManager get_vars() 33277 1726883065.90097: Calling all_inventory to load vars for managed_node2 33277 1726883065.90100: Calling groups_inventory to load vars for managed_node2 33277 1726883065.90103: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.90115: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.90118: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.90121: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.90301: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b7 33277 1726883065.90304: WORKER PROCESS EXITING 33277 1726883065.90557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.90834: done with get_vars() 33277 1726883065.90845: done getting variables 33277 1726883065.90912: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:25 -0400 (0:00:00.028) 0:00:09.097 ****** 33277 1726883065.90947: entering _queue_task() for managed_node2/package 33277 1726883065.91329: worker is 1 (out of 1 available) 33277 1726883065.91342: exiting _queue_task() for managed_node2/package 33277 1726883065.91351: done queuing things up, now waiting for results queue to drain 33277 1726883065.91353: waiting for pending results... 
33277 1726883065.91555: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 33277 1726883065.91708: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b8 33277 1726883065.91730: variable 'ansible_search_path' from source: unknown 33277 1726883065.91738: variable 'ansible_search_path' from source: unknown 33277 1726883065.91789: calling self._execute() 33277 1726883065.91890: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.91909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.91990: variable 'omit' from source: magic vars 33277 1726883065.92356: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.92374: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.92511: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.92529: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.92537: when evaluation is False, skipping this task 33277 1726883065.92545: _execute() done 33277 1726883065.92556: dumping result to json 33277 1726883065.92563: done dumping result, returning 33277 1726883065.92574: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-6628-6da4-0000000000b8] 33277 1726883065.92583: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b8 33277 1726883065.92815: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b8 33277 1726883065.92819: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.92880: no more pending results, returning what we have 33277 1726883065.92884: results queue empty 33277 1726883065.92888: checking for any_errors_fatal 33277 1726883065.92896: done 
checking for any_errors_fatal 33277 1726883065.92897: checking for max_fail_percentage 33277 1726883065.92898: done checking for max_fail_percentage 33277 1726883065.92900: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.92901: done checking to see if all hosts have failed 33277 1726883065.92901: getting the remaining hosts for this loop 33277 1726883065.92903: done getting the remaining hosts for this loop 33277 1726883065.92907: getting the next task for host managed_node2 33277 1726883065.92914: done getting next task for host managed_node2 33277 1726883065.92918: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33277 1726883065.92920: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.92946: getting variables 33277 1726883065.92948: in VariableManager get_vars() 33277 1726883065.93004: Calling all_inventory to load vars for managed_node2 33277 1726883065.93007: Calling groups_inventory to load vars for managed_node2 33277 1726883065.93009: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.93160: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.93165: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.93168: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.93365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.93636: done with get_vars() 33277 1726883065.93647: done getting variables 33277 1726883065.93832: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:44:25 -0400 (0:00:00.029) 0:00:09.126 ****** 33277 1726883065.93865: entering _queue_task() for managed_node2/package 33277 1726883065.94410: worker is 1 (out of 1 available) 33277 1726883065.94632: exiting _queue_task() for managed_node2/package 33277 1726883065.94643: done queuing things up, now waiting for results queue to drain 33277 1726883065.94645: waiting for pending results... 
33277 1726883065.94908: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 33277 1726883065.95176: in run() - task 0affc7ec-ae25-6628-6da4-0000000000b9 33277 1726883065.95181: variable 'ansible_search_path' from source: unknown 33277 1726883065.95184: variable 'ansible_search_path' from source: unknown 33277 1726883065.95218: calling self._execute() 33277 1726883065.95426: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.95503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.95510: variable 'omit' from source: magic vars 33277 1726883065.96434: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.96485: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.96598: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.96609: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.96616: when evaluation is False, skipping this task 33277 1726883065.96625: _execute() done 33277 1726883065.96647: dumping result to json 33277 1726883065.96650: done dumping result, returning 33277 1726883065.96706: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-6628-6da4-0000000000b9] 33277 1726883065.96709: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b9 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.96858: no more pending results, returning what we have 33277 1726883065.96862: results queue empty 33277 1726883065.96864: checking for any_errors_fatal 33277 1726883065.96869: done checking for any_errors_fatal 33277 
1726883065.96870: checking for max_fail_percentage 33277 1726883065.96872: done checking for max_fail_percentage 33277 1726883065.96873: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.96874: done checking to see if all hosts have failed 33277 1726883065.96875: getting the remaining hosts for this loop 33277 1726883065.96877: done getting the remaining hosts for this loop 33277 1726883065.96881: getting the next task for host managed_node2 33277 1726883065.96890: done getting next task for host managed_node2 33277 1726883065.96894: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33277 1726883065.96897: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.96921: getting variables 33277 1726883065.96925: in VariableManager get_vars() 33277 1726883065.96976: Calling all_inventory to load vars for managed_node2 33277 1726883065.96979: Calling groups_inventory to load vars for managed_node2 33277 1726883065.96981: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.96997: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.97000: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.97003: Calling groups_plugins_play to load vars for managed_node2 33277 1726883065.97518: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000b9 33277 1726883065.97521: WORKER PROCESS EXITING 33277 1726883065.97547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883065.97812: done with get_vars() 33277 1726883065.97821: done getting variables 33277 1726883065.97882: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:44:25 -0400 (0:00:00.040) 0:00:09.167 ****** 33277 1726883065.97927: entering _queue_task() for managed_node2/package 33277 1726883065.98190: worker is 1 (out of 1 available) 33277 1726883065.98204: exiting _queue_task() for managed_node2/package 33277 1726883065.98216: done queuing things up, now waiting for results queue to drain 33277 1726883065.98218: waiting for pending results... 
33277 1726883065.98503: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 33277 1726883065.98635: in run() - task 0affc7ec-ae25-6628-6da4-0000000000ba 33277 1726883065.98655: variable 'ansible_search_path' from source: unknown 33277 1726883065.98673: variable 'ansible_search_path' from source: unknown 33277 1726883065.98718: calling self._execute() 33277 1726883065.98824: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883065.98837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883065.98891: variable 'omit' from source: magic vars 33277 1726883065.99259: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.99275: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883065.99413: variable 'ansible_distribution_major_version' from source: facts 33277 1726883065.99430: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883065.99543: when evaluation is False, skipping this task 33277 1726883065.99548: _execute() done 33277 1726883065.99550: dumping result to json 33277 1726883065.99552: done dumping result, returning 33277 1726883065.99555: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-6628-6da4-0000000000ba] 33277 1726883065.99557: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000ba 33277 1726883065.99632: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000ba 33277 1726883065.99635: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883065.99687: no more pending results, returning what we have 33277 1726883065.99691: results queue 
empty 33277 1726883065.99692: checking for any_errors_fatal 33277 1726883065.99701: done checking for any_errors_fatal 33277 1726883065.99702: checking for max_fail_percentage 33277 1726883065.99704: done checking for max_fail_percentage 33277 1726883065.99705: checking to see if all hosts have failed and the running result is not ok 33277 1726883065.99706: done checking to see if all hosts have failed 33277 1726883065.99707: getting the remaining hosts for this loop 33277 1726883065.99708: done getting the remaining hosts for this loop 33277 1726883065.99712: getting the next task for host managed_node2 33277 1726883065.99718: done getting next task for host managed_node2 33277 1726883065.99724: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33277 1726883065.99727: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883065.99749: getting variables 33277 1726883065.99750: in VariableManager get_vars() 33277 1726883065.99800: Calling all_inventory to load vars for managed_node2 33277 1726883065.99803: Calling groups_inventory to load vars for managed_node2 33277 1726883065.99806: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883065.99818: Calling all_plugins_play to load vars for managed_node2 33277 1726883065.99821: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883065.99931: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.00210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.00482: done with get_vars() 33277 1726883066.00495: done getting variables 33277 1726883066.00554: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:44:26 -0400 (0:00:00.026) 0:00:09.194 ****** 33277 1726883066.00593: entering _queue_task() for managed_node2/service 33277 1726883066.00953: worker is 1 (out of 1 available) 33277 1726883066.00964: exiting _queue_task() for managed_node2/service 33277 1726883066.00973: done queuing things up, now waiting for results queue to drain 33277 1726883066.00975: waiting for pending results... 
33277 1726883066.01339: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 33277 1726883066.01344: in run() - task 0affc7ec-ae25-6628-6da4-0000000000bb 33277 1726883066.01348: variable 'ansible_search_path' from source: unknown 33277 1726883066.01358: variable 'ansible_search_path' from source: unknown 33277 1726883066.01396: calling self._execute() 33277 1726883066.01491: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.01504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.01517: variable 'omit' from source: magic vars 33277 1726883066.01992: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.02023: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.02232: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.02239: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.02242: when evaluation is False, skipping this task 33277 1726883066.02244: _execute() done 33277 1726883066.02246: dumping result to json 33277 1726883066.02248: done dumping result, returning 33277 1726883066.02251: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-0000000000bb] 33277 1726883066.02253: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000bb 33277 1726883066.02328: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000bb 33277 1726883066.02331: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.02391: no more pending results, returning what we have 33277 1726883066.02395: results queue empty 
33277 1726883066.02396: checking for any_errors_fatal 33277 1726883066.02403: done checking for any_errors_fatal 33277 1726883066.02404: checking for max_fail_percentage 33277 1726883066.02406: done checking for max_fail_percentage 33277 1726883066.02407: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.02408: done checking to see if all hosts have failed 33277 1726883066.02409: getting the remaining hosts for this loop 33277 1726883066.02410: done getting the remaining hosts for this loop 33277 1726883066.02415: getting the next task for host managed_node2 33277 1726883066.02421: done getting next task for host managed_node2 33277 1726883066.02427: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33277 1726883066.02430: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.02460: getting variables 33277 1726883066.02462: in VariableManager get_vars() 33277 1726883066.02512: Calling all_inventory to load vars for managed_node2 33277 1726883066.02515: Calling groups_inventory to load vars for managed_node2 33277 1726883066.02518: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.02567: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.02570: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.02574: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.02980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.03253: done with get_vars() 33277 1726883066.03262: done getting variables 33277 1726883066.03331: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:44:26 -0400 (0:00:00.027) 0:00:09.221 ****** 33277 1726883066.03360: entering _queue_task() for managed_node2/service 33277 1726883066.03606: worker is 1 (out of 1 available) 33277 1726883066.03620: exiting _queue_task() for managed_node2/service 33277 1726883066.03634: done queuing things up, now waiting for results queue to drain 33277 1726883066.03636: waiting for pending results... 
33277 1726883066.03993: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 33277 1726883066.04068: in run() - task 0affc7ec-ae25-6628-6da4-0000000000bc 33277 1726883066.04101: variable 'ansible_search_path' from source: unknown 33277 1726883066.04200: variable 'ansible_search_path' from source: unknown 33277 1726883066.04204: calling self._execute() 33277 1726883066.04244: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.04256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.04270: variable 'omit' from source: magic vars 33277 1726883066.04674: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.04695: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.04831: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.04850: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.04863: when evaluation is False, skipping this task 33277 1726883066.04871: _execute() done 33277 1726883066.04878: dumping result to json 33277 1726883066.04888: done dumping result, returning 33277 1726883066.04899: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-6628-6da4-0000000000bc] 33277 1726883066.04906: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000bc 33277 1726883066.05118: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000bc 33277 1726883066.05121: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33277 1726883066.05160: no more pending results, returning what we have 33277 1726883066.05164: results queue empty 33277 1726883066.05165: checking for any_errors_fatal 
33277 1726883066.05176: done checking for any_errors_fatal 33277 1726883066.05177: checking for max_fail_percentage 33277 1726883066.05178: done checking for max_fail_percentage 33277 1726883066.05179: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.05180: done checking to see if all hosts have failed 33277 1726883066.05181: getting the remaining hosts for this loop 33277 1726883066.05182: done getting the remaining hosts for this loop 33277 1726883066.05188: getting the next task for host managed_node2 33277 1726883066.05192: done getting next task for host managed_node2 33277 1726883066.05196: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33277 1726883066.05198: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.05216: getting variables 33277 1726883066.05217: in VariableManager get_vars() 33277 1726883066.05264: Calling all_inventory to load vars for managed_node2 33277 1726883066.05267: Calling groups_inventory to load vars for managed_node2 33277 1726883066.05270: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.05372: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.05376: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.05381: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.05601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.05899: done with get_vars() 33277 1726883066.05909: done getting variables 33277 1726883066.05976: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:44:26 -0400 (0:00:00.026) 0:00:09.248 ****** 33277 1726883066.06012: entering _queue_task() for managed_node2/service 33277 1726883066.06449: worker is 1 (out of 1 available) 33277 1726883066.06460: exiting _queue_task() for managed_node2/service 33277 1726883066.06470: done queuing things up, now waiting for results queue to drain 33277 1726883066.06471: waiting for pending results... 
33277 1726883066.06737: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 33277 1726883066.06829: in run() - task 0affc7ec-ae25-6628-6da4-0000000000bd 33277 1726883066.06852: variable 'ansible_search_path' from source: unknown 33277 1726883066.06861: variable 'ansible_search_path' from source: unknown 33277 1726883066.06909: calling self._execute() 33277 1726883066.07008: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.07048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.07052: variable 'omit' from source: magic vars 33277 1726883066.07464: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.07527: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.07647: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.07658: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.07664: when evaluation is False, skipping this task 33277 1726883066.07670: _execute() done 33277 1726883066.07677: dumping result to json 33277 1726883066.07684: done dumping result, returning 33277 1726883066.07802: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-6628-6da4-0000000000bd] 33277 1726883066.07806: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000bd 33277 1726883066.07880: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000bd 33277 1726883066.07883: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.07940: no more pending results, returning what we have 33277 1726883066.07951: results queue empty 33277 1726883066.07952: checking for any_errors_fatal 
33277 1726883066.07962: done checking for any_errors_fatal 33277 1726883066.07963: checking for max_fail_percentage 33277 1726883066.07965: done checking for max_fail_percentage 33277 1726883066.07966: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.07967: done checking to see if all hosts have failed 33277 1726883066.07968: getting the remaining hosts for this loop 33277 1726883066.07969: done getting the remaining hosts for this loop 33277 1726883066.07974: getting the next task for host managed_node2 33277 1726883066.07980: done getting next task for host managed_node2 33277 1726883066.07984: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 33277 1726883066.07989: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.08128: getting variables 33277 1726883066.08131: in VariableManager get_vars() 33277 1726883066.08173: Calling all_inventory to load vars for managed_node2 33277 1726883066.08176: Calling groups_inventory to load vars for managed_node2 33277 1726883066.08179: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.08190: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.08193: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.08197: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.08455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.08719: done with get_vars() 33277 1726883066.08733: done getting variables 33277 1726883066.08806: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:44:26 -0400 (0:00:00.028) 0:00:09.276 ****** 33277 1726883066.08838: entering _queue_task() for managed_node2/service 33277 1726883066.09228: worker is 1 (out of 1 available) 33277 1726883066.09241: exiting _queue_task() for managed_node2/service 33277 1726883066.09250: done queuing things up, now waiting for results queue to drain 33277 1726883066.09252: waiting for pending results... 
33277 1726883066.09540: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 33277 1726883066.09544: in run() - task 0affc7ec-ae25-6628-6da4-0000000000be 33277 1726883066.09564: variable 'ansible_search_path' from source: unknown 33277 1726883066.09572: variable 'ansible_search_path' from source: unknown 33277 1726883066.09624: calling self._execute() 33277 1726883066.09720: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.09807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.09815: variable 'omit' from source: magic vars 33277 1726883066.10173: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.10194: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.10334: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.10359: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.10362: when evaluation is False, skipping this task 33277 1726883066.10430: _execute() done 33277 1726883066.10433: dumping result to json 33277 1726883066.10436: done dumping result, returning 33277 1726883066.10439: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-6628-6da4-0000000000be] 33277 1726883066.10442: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000be 33277 1726883066.10520: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000be skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 33277 1726883066.10569: no more pending results, returning what we have 33277 1726883066.10573: results queue empty 33277 1726883066.10574: checking for any_errors_fatal 33277 1726883066.10581: done checking for any_errors_fatal 33277 
1726883066.10581: checking for max_fail_percentage 33277 1726883066.10583: done checking for max_fail_percentage 33277 1726883066.10584: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.10588: done checking to see if all hosts have failed 33277 1726883066.10589: getting the remaining hosts for this loop 33277 1726883066.10591: done getting the remaining hosts for this loop 33277 1726883066.10595: getting the next task for host managed_node2 33277 1726883066.10601: done getting next task for host managed_node2 33277 1726883066.10605: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33277 1726883066.10608: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.10632: getting variables 33277 1726883066.10634: in VariableManager get_vars() 33277 1726883066.10689: Calling all_inventory to load vars for managed_node2 33277 1726883066.10692: Calling groups_inventory to load vars for managed_node2 33277 1726883066.10695: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.10708: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.10711: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.10714: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.11153: WORKER PROCESS EXITING 33277 1726883066.11176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.11436: done with get_vars() 33277 1726883066.11445: done getting variables 33277 1726883066.11508: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:44:26 -0400 (0:00:00.026) 0:00:09.303 ****** 33277 1726883066.11541: entering _queue_task() for managed_node2/copy 33277 1726883066.11783: worker is 1 (out of 1 available) 33277 1726883066.11915: exiting _queue_task() for managed_node2/copy 33277 1726883066.11926: done queuing things up, now waiting for results queue to drain 33277 1726883066.11928: waiting for pending results... 
33277 1726883066.12098: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33277 1726883066.12259: in run() - task 0affc7ec-ae25-6628-6da4-0000000000bf 33277 1726883066.12263: variable 'ansible_search_path' from source: unknown 33277 1726883066.12267: variable 'ansible_search_path' from source: unknown 33277 1726883066.12328: calling self._execute() 33277 1726883066.12408: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.12418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.12458: variable 'omit' from source: magic vars 33277 1726883066.12832: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.12853: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.12990: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.13116: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.13119: when evaluation is False, skipping this task 33277 1726883066.13124: _execute() done 33277 1726883066.13127: dumping result to json 33277 1726883066.13129: done dumping result, returning 33277 1726883066.13132: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-6628-6da4-0000000000bf] 33277 1726883066.13135: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000bf 33277 1726883066.13209: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000bf 33277 1726883066.13212: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.13263: no more pending results, returning what we have 33277 1726883066.13267: results queue empty 33277 
1726883066.13268: checking for any_errors_fatal 33277 1726883066.13276: done checking for any_errors_fatal 33277 1726883066.13277: checking for max_fail_percentage 33277 1726883066.13278: done checking for max_fail_percentage 33277 1726883066.13279: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.13280: done checking to see if all hosts have failed 33277 1726883066.13281: getting the remaining hosts for this loop 33277 1726883066.13282: done getting the remaining hosts for this loop 33277 1726883066.13289: getting the next task for host managed_node2 33277 1726883066.13296: done getting next task for host managed_node2 33277 1726883066.13300: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33277 1726883066.13302: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.13324: getting variables 33277 1726883066.13327: in VariableManager get_vars() 33277 1726883066.13377: Calling all_inventory to load vars for managed_node2 33277 1726883066.13380: Calling groups_inventory to load vars for managed_node2 33277 1726883066.13383: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.13399: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.13401: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.13404: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.14053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.14672: done with get_vars() 33277 1726883066.14683: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:26 -0400 (0:00:00.033) 0:00:09.337 ****** 33277 1726883066.14938: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 33277 1726883066.15419: worker is 1 (out of 1 available) 33277 1726883066.15436: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 33277 1726883066.15449: done queuing things up, now waiting for results queue to drain 33277 1726883066.15451: waiting for pending results... 
33277 1726883066.16015: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33277 1726883066.16024: in run() - task 0affc7ec-ae25-6628-6da4-0000000000c0 33277 1726883066.16028: variable 'ansible_search_path' from source: unknown 33277 1726883066.16030: variable 'ansible_search_path' from source: unknown 33277 1726883066.16033: calling self._execute() 33277 1726883066.16035: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.16038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.16114: variable 'omit' from source: magic vars 33277 1726883066.16631: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.16642: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.16758: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.16762: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.16765: when evaluation is False, skipping this task 33277 1726883066.16768: _execute() done 33277 1726883066.16771: dumping result to json 33277 1726883066.16773: done dumping result, returning 33277 1726883066.16817: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-6628-6da4-0000000000c0] 33277 1726883066.16820: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c0 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.17052: no more pending results, returning what we have 33277 1726883066.17055: results queue empty 33277 1726883066.17056: checking for any_errors_fatal 33277 1726883066.17062: done checking for any_errors_fatal 33277 1726883066.17063: checking for max_fail_percentage 33277 
1726883066.17065: done checking for max_fail_percentage 33277 1726883066.17066: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.17067: done checking to see if all hosts have failed 33277 1726883066.17067: getting the remaining hosts for this loop 33277 1726883066.17068: done getting the remaining hosts for this loop 33277 1726883066.17072: getting the next task for host managed_node2 33277 1726883066.17077: done getting next task for host managed_node2 33277 1726883066.17080: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33277 1726883066.17083: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.17099: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c0 33277 1726883066.17103: WORKER PROCESS EXITING 33277 1726883066.17113: getting variables 33277 1726883066.17115: in VariableManager get_vars() 33277 1726883066.17158: Calling all_inventory to load vars for managed_node2 33277 1726883066.17160: Calling groups_inventory to load vars for managed_node2 33277 1726883066.17163: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.17172: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.17174: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.17177: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.17628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.17868: done with get_vars() 33277 1726883066.17878: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:26 -0400 (0:00:00.030) 0:00:09.367 ****** 33277 1726883066.17960: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 33277 1726883066.18277: worker is 1 (out of 1 available) 33277 1726883066.18296: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 33277 1726883066.18309: done queuing things up, now waiting for results queue to drain 33277 1726883066.18310: waiting for pending results... 
33277 1726883066.18842: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 33277 1726883066.19059: in run() - task 0affc7ec-ae25-6628-6da4-0000000000c1 33277 1726883066.19082: variable 'ansible_search_path' from source: unknown 33277 1726883066.19268: variable 'ansible_search_path' from source: unknown 33277 1726883066.19272: calling self._execute() 33277 1726883066.19355: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.19476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.19495: variable 'omit' from source: magic vars 33277 1726883066.19972: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.20028: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.20115: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.20129: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.20141: when evaluation is False, skipping this task 33277 1726883066.20149: _execute() done 33277 1726883066.20155: dumping result to json 33277 1726883066.20163: done dumping result, returning 33277 1726883066.20175: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-6628-6da4-0000000000c1] 33277 1726883066.20228: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c1 33277 1726883066.20528: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c1 33277 1726883066.20532: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.20574: no more pending results, returning what we have 33277 1726883066.20577: results queue empty 33277 1726883066.20578: checking for any_errors_fatal 33277 
1726883066.20584: done checking for any_errors_fatal 33277 1726883066.20585: checking for max_fail_percentage 33277 1726883066.20586: done checking for max_fail_percentage 33277 1726883066.20587: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.20588: done checking to see if all hosts have failed 33277 1726883066.20589: getting the remaining hosts for this loop 33277 1726883066.20590: done getting the remaining hosts for this loop 33277 1726883066.20594: getting the next task for host managed_node2 33277 1726883066.20600: done getting next task for host managed_node2 33277 1726883066.20604: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33277 1726883066.20607: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.20627: getting variables 33277 1726883066.20629: in VariableManager get_vars() 33277 1726883066.20677: Calling all_inventory to load vars for managed_node2 33277 1726883066.20680: Calling groups_inventory to load vars for managed_node2 33277 1726883066.20682: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.20691: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.20694: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.20697: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.20913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.21164: done with get_vars() 33277 1726883066.21175: done getting variables 33277 1726883066.21236: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:26 -0400 (0:00:00.033) 0:00:09.401 ****** 33277 1726883066.21270: entering _queue_task() for managed_node2/debug 33277 1726883066.21735: worker is 1 (out of 1 available) 33277 1726883066.21745: exiting _queue_task() for managed_node2/debug 33277 1726883066.21755: done queuing things up, now waiting for results queue to drain 33277 1726883066.21756: waiting for pending results... 
33277 1726883066.21855: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33277 1726883066.22052: in run() - task 0affc7ec-ae25-6628-6da4-0000000000c2 33277 1726883066.22074: variable 'ansible_search_path' from source: unknown 33277 1726883066.22082: variable 'ansible_search_path' from source: unknown 33277 1726883066.22132: calling self._execute() 33277 1726883066.22239: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.22250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.22263: variable 'omit' from source: magic vars 33277 1726883066.22882: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.22909: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.23074: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.23078: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.23081: when evaluation is False, skipping this task 33277 1726883066.23084: _execute() done 33277 1726883066.23086: dumping result to json 33277 1726883066.23089: done dumping result, returning 33277 1726883066.23092: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-6628-6da4-0000000000c2] 33277 1726883066.23095: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c2 33277 1726883066.23248: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c2 33277 1726883066.23251: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883066.23330: no more pending results, returning what we have 33277 1726883066.23333: results queue empty 33277 1726883066.23334: checking for any_errors_fatal 33277 1726883066.23339: done 
checking for any_errors_fatal 33277 1726883066.23339: checking for max_fail_percentage 33277 1726883066.23341: done checking for max_fail_percentage 33277 1726883066.23342: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.23343: done checking to see if all hosts have failed 33277 1726883066.23343: getting the remaining hosts for this loop 33277 1726883066.23344: done getting the remaining hosts for this loop 33277 1726883066.23348: getting the next task for host managed_node2 33277 1726883066.23352: done getting next task for host managed_node2 33277 1726883066.23356: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33277 1726883066.23358: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.23373: getting variables 33277 1726883066.23374: in VariableManager get_vars() 33277 1726883066.23415: Calling all_inventory to load vars for managed_node2 33277 1726883066.23418: Calling groups_inventory to load vars for managed_node2 33277 1726883066.23421: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.23433: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.23436: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.23439: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.23681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.23920: done with get_vars() 33277 1726883066.23933: done getting variables 33277 1726883066.23993: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:26 -0400 (0:00:00.027) 0:00:09.428 ****** 33277 1726883066.24028: entering _queue_task() for managed_node2/debug 33277 1726883066.24459: worker is 1 (out of 1 available) 33277 1726883066.24470: exiting _queue_task() for managed_node2/debug 33277 1726883066.24478: done queuing things up, now waiting for results queue to drain 33277 1726883066.24480: waiting for pending results... 
33277 1726883066.24718: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33277 1726883066.25129: in run() - task 0affc7ec-ae25-6628-6da4-0000000000c3 33277 1726883066.25132: variable 'ansible_search_path' from source: unknown 33277 1726883066.25135: variable 'ansible_search_path' from source: unknown 33277 1726883066.25138: calling self._execute() 33277 1726883066.25141: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.25143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.25146: variable 'omit' from source: magic vars 33277 1726883066.26017: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.26050: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.26213: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.26263: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.26272: when evaluation is False, skipping this task 33277 1726883066.26280: _execute() done 33277 1726883066.26287: dumping result to json 33277 1726883066.26296: done dumping result, returning 33277 1726883066.26309: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-6628-6da4-0000000000c3] 33277 1726883066.26320: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c3 skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883066.26474: no more pending results, returning what we have 33277 1726883066.26479: results queue empty 33277 1726883066.26480: checking for any_errors_fatal 33277 1726883066.26488: done checking for any_errors_fatal 33277 1726883066.26489: checking for max_fail_percentage 33277 1726883066.26491: done checking for 
max_fail_percentage 33277 1726883066.26492: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.26493: done checking to see if all hosts have failed 33277 1726883066.26494: getting the remaining hosts for this loop 33277 1726883066.26495: done getting the remaining hosts for this loop 33277 1726883066.26499: getting the next task for host managed_node2 33277 1726883066.26506: done getting next task for host managed_node2 33277 1726883066.26510: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33277 1726883066.26513: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.26538: getting variables 33277 1726883066.26540: in VariableManager get_vars() 33277 1726883066.26593: Calling all_inventory to load vars for managed_node2 33277 1726883066.26596: Calling groups_inventory to load vars for managed_node2 33277 1726883066.26598: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.26612: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.26615: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.26619: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.26737: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c3 33277 1726883066.26740: WORKER PROCESS EXITING 33277 1726883066.27107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.27388: done with get_vars() 33277 1726883066.27397: done getting variables 33277 1726883066.27450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:26 -0400 (0:00:00.034) 0:00:09.463 ****** 33277 1726883066.27481: entering _queue_task() for managed_node2/debug 33277 1726883066.27730: worker is 1 (out of 1 available) 33277 1726883066.27743: exiting _queue_task() for managed_node2/debug 33277 1726883066.27757: done queuing things up, now waiting for results queue to drain 33277 1726883066.27758: waiting for pending results... 
33277 1726883066.28037: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33277 1726883066.28294: in run() - task 0affc7ec-ae25-6628-6da4-0000000000c4 33277 1726883066.28375: variable 'ansible_search_path' from source: unknown 33277 1726883066.28424: variable 'ansible_search_path' from source: unknown 33277 1726883066.28459: calling self._execute() 33277 1726883066.28640: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.28644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.28646: variable 'omit' from source: magic vars 33277 1726883066.28995: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.29013: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.29143: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.29154: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.29162: when evaluation is False, skipping this task 33277 1726883066.29169: _execute() done 33277 1726883066.29180: dumping result to json 33277 1726883066.29189: done dumping result, returning 33277 1726883066.29201: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-6628-6da4-0000000000c4] 33277 1726883066.29211: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c4 33277 1726883066.29462: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c4 33277 1726883066.29465: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883066.29508: no more pending results, returning what we have 33277 1726883066.29512: results queue empty 33277 1726883066.29513: checking for any_errors_fatal 33277 1726883066.29520: done checking for 
any_errors_fatal 33277 1726883066.29521: checking for max_fail_percentage 33277 1726883066.29525: done checking for max_fail_percentage 33277 1726883066.29526: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.29527: done checking to see if all hosts have failed 33277 1726883066.29528: getting the remaining hosts for this loop 33277 1726883066.29529: done getting the remaining hosts for this loop 33277 1726883066.29533: getting the next task for host managed_node2 33277 1726883066.29539: done getting next task for host managed_node2 33277 1726883066.29543: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33277 1726883066.29545: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.29565: getting variables 33277 1726883066.29567: in VariableManager get_vars() 33277 1726883066.29614: Calling all_inventory to load vars for managed_node2 33277 1726883066.29617: Calling groups_inventory to load vars for managed_node2 33277 1726883066.29620: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.29713: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.29717: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.29720: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.29927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.30169: done with get_vars() 33277 1726883066.30179: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:44:26 -0400 (0:00:00.027) 0:00:09.491 ****** 33277 1726883066.30277: entering _queue_task() for managed_node2/ping 33277 1726883066.30514: worker is 1 (out of 1 available) 33277 1726883066.30728: exiting _queue_task() for managed_node2/ping 33277 1726883066.30739: done queuing things up, now waiting for results queue to drain 33277 1726883066.30741: waiting for pending results... 
33277 1726883066.30817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 33277 1726883066.30961: in run() - task 0affc7ec-ae25-6628-6da4-0000000000c5 33277 1726883066.30986: variable 'ansible_search_path' from source: unknown 33277 1726883066.30995: variable 'ansible_search_path' from source: unknown 33277 1726883066.31039: calling self._execute() 33277 1726883066.31130: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.31143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.31188: variable 'omit' from source: magic vars 33277 1726883066.31552: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.31570: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.31696: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.31708: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.31728: when evaluation is False, skipping this task 33277 1726883066.31731: _execute() done 33277 1726883066.31735: dumping result to json 33277 1726883066.31840: done dumping result, returning 33277 1726883066.31844: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-6628-6da4-0000000000c5] 33277 1726883066.31846: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c5 33277 1726883066.31913: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000c5 33277 1726883066.31917: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.31990: no more pending results, returning what we have 33277 1726883066.31994: results queue empty 33277 1726883066.31996: checking for any_errors_fatal 33277 
1726883066.32002: done checking for any_errors_fatal 33277 1726883066.32003: checking for max_fail_percentage 33277 1726883066.32004: done checking for max_fail_percentage 33277 1726883066.32006: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.32007: done checking to see if all hosts have failed 33277 1726883066.32008: getting the remaining hosts for this loop 33277 1726883066.32009: done getting the remaining hosts for this loop 33277 1726883066.32013: getting the next task for host managed_node2 33277 1726883066.32025: done getting next task for host managed_node2 33277 1726883066.32027: ^ task is: TASK: meta (role_complete) 33277 1726883066.32031: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883066.32055: getting variables 33277 1726883066.32057: in VariableManager get_vars() 33277 1726883066.32106: Calling all_inventory to load vars for managed_node2 33277 1726883066.32109: Calling groups_inventory to load vars for managed_node2 33277 1726883066.32112: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.32228: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.32232: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.32236: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.32485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.32725: done with get_vars() 33277 1726883066.32734: done getting variables 33277 1726883066.32813: done queuing things up, now waiting for results queue to drain 33277 1726883066.32816: results queue empty 33277 1726883066.32816: checking for any_errors_fatal 33277 1726883066.32819: done checking for any_errors_fatal 33277 1726883066.32819: checking for max_fail_percentage 33277 1726883066.32820: done checking for max_fail_percentage 33277 1726883066.32823: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.32824: done checking to see if all hosts have failed 33277 1726883066.32825: getting the remaining hosts for this loop 33277 1726883066.32825: done getting the remaining hosts for this loop 33277 1726883066.32828: getting the next task for host managed_node2 33277 1726883066.32834: done getting next task for host managed_node2 33277 1726883066.32837: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33277 1726883066.32840: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 33277 1726883066.32850: getting variables 33277 1726883066.32852: in VariableManager get_vars() 33277 1726883066.32871: Calling all_inventory to load vars for managed_node2 33277 1726883066.32873: Calling groups_inventory to load vars for managed_node2 33277 1726883066.32875: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.32880: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.32882: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.32885: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.33042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.33277: done with get_vars() 33277 1726883066.33286: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:44:26 -0400 (0:00:00.030) 0:00:09.521 ****** 33277 1726883066.33365: entering _queue_task() for managed_node2/include_tasks 33277 1726883066.33600: worker is 1 (out of 1 available) 33277 1726883066.33613: exiting _queue_task() for managed_node2/include_tasks 33277 1726883066.33827: done queuing things up, now waiting for 
results queue to drain 33277 1726883066.33830: waiting for pending results... 33277 1726883066.33896: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 33277 1726883066.34164: in run() - task 0affc7ec-ae25-6628-6da4-0000000000fd 33277 1726883066.34168: variable 'ansible_search_path' from source: unknown 33277 1726883066.34172: variable 'ansible_search_path' from source: unknown 33277 1726883066.34175: calling self._execute() 33277 1726883066.34211: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.34226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.34241: variable 'omit' from source: magic vars 33277 1726883066.34632: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.34650: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.34776: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.34788: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.34796: when evaluation is False, skipping this task 33277 1726883066.34804: _execute() done 33277 1726883066.34816: dumping result to json 33277 1726883066.34826: done dumping result, returning 33277 1726883066.34839: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-6628-6da4-0000000000fd] 33277 1726883066.34850: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000fd skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.35106: no more pending results, returning what we have 33277 1726883066.35111: results queue empty 33277 1726883066.35112: checking for any_errors_fatal 33277 1726883066.35114: done checking for any_errors_fatal 
33277 1726883066.35115: checking for max_fail_percentage 33277 1726883066.35116: done checking for max_fail_percentage 33277 1726883066.35117: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.35118: done checking to see if all hosts have failed 33277 1726883066.35119: getting the remaining hosts for this loop 33277 1726883066.35120: done getting the remaining hosts for this loop 33277 1726883066.35128: getting the next task for host managed_node2 33277 1726883066.35135: done getting next task for host managed_node2 33277 1726883066.35140: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 33277 1726883066.35144: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.35170: getting variables 33277 1726883066.35172: in VariableManager get_vars() 33277 1726883066.35219: Calling all_inventory to load vars for managed_node2 33277 1726883066.35319: Calling groups_inventory to load vars for managed_node2 33277 1726883066.35326: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.35334: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000fd 33277 1726883066.35338: WORKER PROCESS EXITING 33277 1726883066.35346: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.35356: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.35361: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.35709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.36048: done with get_vars() 33277 1726883066.36057: done getting variables 33277 1726883066.36126: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:44:26 -0400 (0:00:00.027) 0:00:09.549 ****** 33277 1726883066.36158: entering _queue_task() for managed_node2/debug 33277 1726883066.36444: worker is 1 (out of 1 available) 33277 1726883066.36458: exiting _queue_task() for managed_node2/debug 33277 1726883066.36470: done queuing things up, now waiting for results queue to drain 33277 1726883066.36472: waiting for pending results... 
33277 1726883066.36865: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 33277 1726883066.36983: in run() - task 0affc7ec-ae25-6628-6da4-0000000000fe 33277 1726883066.36995: variable 'ansible_search_path' from source: unknown 33277 1726883066.37067: variable 'ansible_search_path' from source: unknown 33277 1726883066.37071: calling self._execute() 33277 1726883066.37171: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.37191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.37293: variable 'omit' from source: magic vars 33277 1726883066.37678: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.37709: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.37938: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.37945: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.37949: when evaluation is False, skipping this task 33277 1726883066.37952: _execute() done 33277 1726883066.37959: dumping result to json 33277 1726883066.37962: done dumping result, returning 33277 1726883066.37973: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-6628-6da4-0000000000fe] 33277 1726883066.37992: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000fe skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883066.38150: no more pending results, returning what we have 33277 1726883066.38154: results queue empty 33277 1726883066.38155: checking for any_errors_fatal 33277 1726883066.38161: done checking for any_errors_fatal 33277 1726883066.38161: checking for max_fail_percentage 33277 1726883066.38163: done checking for max_fail_percentage 33277 1726883066.38164: checking to see if all 
hosts have failed and the running result is not ok 33277 1726883066.38165: done checking to see if all hosts have failed 33277 1726883066.38165: getting the remaining hosts for this loop 33277 1726883066.38166: done getting the remaining hosts for this loop 33277 1726883066.38172: getting the next task for host managed_node2 33277 1726883066.38179: done getting next task for host managed_node2 33277 1726883066.38185: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33277 1726883066.38191: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.38209: getting variables 33277 1726883066.38211: in VariableManager get_vars() 33277 1726883066.38253: Calling all_inventory to load vars for managed_node2 33277 1726883066.38255: Calling groups_inventory to load vars for managed_node2 33277 1726883066.38258: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.38266: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.38271: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.38276: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.38405: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000fe 33277 1726883066.38409: WORKER PROCESS EXITING 33277 1726883066.38420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.38569: done with get_vars() 33277 1726883066.38577: done getting variables 33277 1726883066.38625: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:44:26 -0400 (0:00:00.024) 0:00:09.574 ****** 33277 1726883066.38649: entering _queue_task() for managed_node2/fail 33277 1726883066.38841: worker is 1 (out of 1 available) 33277 1726883066.38856: exiting _queue_task() for managed_node2/fail 33277 1726883066.38867: done queuing things up, now waiting for results queue to drain 33277 1726883066.38868: waiting for pending results... 
33277 1726883066.39050: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 33277 1726883066.39142: in run() - task 0affc7ec-ae25-6628-6da4-0000000000ff 33277 1726883066.39155: variable 'ansible_search_path' from source: unknown 33277 1726883066.39159: variable 'ansible_search_path' from source: unknown 33277 1726883066.39194: calling self._execute() 33277 1726883066.39266: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.39270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.39278: variable 'omit' from source: magic vars 33277 1726883066.39562: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.39573: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.39658: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.39662: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.39664: when evaluation is False, skipping this task 33277 1726883066.39668: _execute() done 33277 1726883066.39671: dumping result to json 33277 1726883066.39673: done dumping result, returning 33277 1726883066.39681: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-6628-6da4-0000000000ff] 33277 1726883066.39684: sending task result for task 0affc7ec-ae25-6628-6da4-0000000000ff 33277 1726883066.39787: done sending task result for task 0affc7ec-ae25-6628-6da4-0000000000ff 33277 1726883066.39790: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
33277 1726883066.39847: no more pending results, returning what we have 33277 1726883066.39851: results queue empty 33277 1726883066.39852: checking for any_errors_fatal 33277 1726883066.39857: done checking for any_errors_fatal 33277 1726883066.39857: checking for max_fail_percentage 33277 1726883066.39859: done checking for max_fail_percentage 33277 1726883066.39860: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.39861: done checking to see if all hosts have failed 33277 1726883066.39861: getting the remaining hosts for this loop 33277 1726883066.39862: done getting the remaining hosts for this loop 33277 1726883066.39865: getting the next task for host managed_node2 33277 1726883066.39871: done getting next task for host managed_node2 33277 1726883066.39874: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33277 1726883066.39878: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.39893: getting variables 33277 1726883066.39895: in VariableManager get_vars() 33277 1726883066.39943: Calling all_inventory to load vars for managed_node2 33277 1726883066.39946: Calling groups_inventory to load vars for managed_node2 33277 1726883066.39949: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.39958: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.39961: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.39964: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.40232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.40494: done with get_vars() 33277 1726883066.40504: done getting variables 33277 1726883066.40563: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:44:26 -0400 (0:00:00.019) 0:00:09.594 ****** 33277 1726883066.40604: entering _queue_task() for managed_node2/fail 33277 1726883066.40871: worker is 1 (out of 1 available) 33277 1726883066.40888: exiting _queue_task() for managed_node2/fail 33277 1726883066.40901: done queuing things up, now waiting for results queue to drain 33277 1726883066.40903: waiting for pending results... 
33277 1726883066.41212: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 33277 1726883066.41314: in run() - task 0affc7ec-ae25-6628-6da4-000000000100 33277 1726883066.41332: variable 'ansible_search_path' from source: unknown 33277 1726883066.41336: variable 'ansible_search_path' from source: unknown 33277 1726883066.41368: calling self._execute() 33277 1726883066.41440: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.41446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.41454: variable 'omit' from source: magic vars 33277 1726883066.41748: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.41760: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.41847: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.41850: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.41853: when evaluation is False, skipping this task 33277 1726883066.41856: _execute() done 33277 1726883066.41858: dumping result to json 33277 1726883066.41862: done dumping result, returning 33277 1726883066.41872: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-6628-6da4-000000000100] 33277 1726883066.41875: sending task result for task 0affc7ec-ae25-6628-6da4-000000000100 33277 1726883066.41973: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000100 33277 1726883066.41975: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.42031: no more 
pending results, returning what we have 33277 1726883066.42035: results queue empty 33277 1726883066.42036: checking for any_errors_fatal 33277 1726883066.42042: done checking for any_errors_fatal 33277 1726883066.42043: checking for max_fail_percentage 33277 1726883066.42045: done checking for max_fail_percentage 33277 1726883066.42046: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.42047: done checking to see if all hosts have failed 33277 1726883066.42047: getting the remaining hosts for this loop 33277 1726883066.42049: done getting the remaining hosts for this loop 33277 1726883066.42052: getting the next task for host managed_node2 33277 1726883066.42060: done getting next task for host managed_node2 33277 1726883066.42063: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33277 1726883066.42067: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.42089: getting variables 33277 1726883066.42091: in VariableManager get_vars() 33277 1726883066.42133: Calling all_inventory to load vars for managed_node2 33277 1726883066.42136: Calling groups_inventory to load vars for managed_node2 33277 1726883066.42138: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.42146: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.42149: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.42156: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.42288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.42437: done with get_vars() 33277 1726883066.42446: done getting variables 33277 1726883066.42493: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:44:26 -0400 (0:00:00.019) 0:00:09.613 ****** 33277 1726883066.42517: entering _queue_task() for managed_node2/fail 33277 1726883066.42721: worker is 1 (out of 1 available) 33277 1726883066.42737: exiting _queue_task() for managed_node2/fail 33277 1726883066.42749: done queuing things up, now waiting for results queue to drain 33277 1726883066.42751: waiting for pending results... 
33277 1726883066.42935: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 33277 1726883066.43036: in run() - task 0affc7ec-ae25-6628-6da4-000000000101 33277 1726883066.43050: variable 'ansible_search_path' from source: unknown 33277 1726883066.43057: variable 'ansible_search_path' from source: unknown 33277 1726883066.43118: calling self._execute() 33277 1726883066.43275: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.43280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.43282: variable 'omit' from source: magic vars 33277 1726883066.43728: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.43732: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.43781: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.43784: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.43790: when evaluation is False, skipping this task 33277 1726883066.43793: _execute() done 33277 1726883066.43795: dumping result to json 33277 1726883066.43798: done dumping result, returning 33277 1726883066.43806: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-6628-6da4-000000000101] 33277 1726883066.43809: sending task result for task 0affc7ec-ae25-6628-6da4-000000000101 33277 1726883066.43929: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000101 33277 1726883066.43931: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.44067: no more pending 
results, returning what we have 33277 1726883066.44071: results queue empty 33277 1726883066.44072: checking for any_errors_fatal 33277 1726883066.44079: done checking for any_errors_fatal 33277 1726883066.44080: checking for max_fail_percentage 33277 1726883066.44082: done checking for max_fail_percentage 33277 1726883066.44083: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.44084: done checking to see if all hosts have failed 33277 1726883066.44085: getting the remaining hosts for this loop 33277 1726883066.44086: done getting the remaining hosts for this loop 33277 1726883066.44089: getting the next task for host managed_node2 33277 1726883066.44096: done getting next task for host managed_node2 33277 1726883066.44099: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33277 1726883066.44103: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.44119: getting variables 33277 1726883066.44121: in VariableManager get_vars() 33277 1726883066.44166: Calling all_inventory to load vars for managed_node2 33277 1726883066.44169: Calling groups_inventory to load vars for managed_node2 33277 1726883066.44171: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.44180: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.44183: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.44186: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.44426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.44664: done with get_vars() 33277 1726883066.44675: done getting variables 33277 1726883066.44736: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:44:26 -0400 (0:00:00.022) 0:00:09.636 ****** 33277 1726883066.44767: entering _queue_task() for managed_node2/dnf 33277 1726883066.45079: worker is 1 (out of 1 available) 33277 1726883066.45142: exiting _queue_task() for managed_node2/dnf 33277 1726883066.45154: done queuing things up, now waiting for results queue to drain 33277 1726883066.45156: waiting for pending results... 
33277 1726883066.45301: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 33277 1726883066.45396: in run() - task 0affc7ec-ae25-6628-6da4-000000000102 33277 1726883066.45409: variable 'ansible_search_path' from source: unknown 33277 1726883066.45413: variable 'ansible_search_path' from source: unknown 33277 1726883066.45445: calling self._execute() 33277 1726883066.45513: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.45517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.45528: variable 'omit' from source: magic vars 33277 1726883066.45813: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.45824: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.45907: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.45910: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.45913: when evaluation is False, skipping this task 33277 1726883066.45916: _execute() done 33277 1726883066.45918: dumping result to json 33277 1726883066.45923: done dumping result, returning 33277 1726883066.45932: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-000000000102] 33277 1726883066.45937: sending task result for task 0affc7ec-ae25-6628-6da4-000000000102 33277 1726883066.46038: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000102 33277 1726883066.46043: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 33277 1726883066.46100: no more pending results, returning what we have 33277 1726883066.46102: results queue empty 33277 1726883066.46103: checking for any_errors_fatal 33277 1726883066.46108: done checking for any_errors_fatal 33277 1726883066.46109: checking for max_fail_percentage 33277 1726883066.46110: done checking for max_fail_percentage 33277 1726883066.46111: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.46112: done checking to see if all hosts have failed 33277 1726883066.46113: getting the remaining hosts for this loop 33277 1726883066.46114: done getting the remaining hosts for this loop 33277 1726883066.46117: getting the next task for host managed_node2 33277 1726883066.46125: done getting next task for host managed_node2 33277 1726883066.46128: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33277 1726883066.46132: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.46148: getting variables 33277 1726883066.46149: in VariableManager get_vars() 33277 1726883066.46191: Calling all_inventory to load vars for managed_node2 33277 1726883066.46193: Calling groups_inventory to load vars for managed_node2 33277 1726883066.46195: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.46201: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.46203: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.46205: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.46418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.46726: done with get_vars() 33277 1726883066.46737: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 33277 1726883066.46815: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:44:26 -0400 (0:00:00.021) 0:00:09.657 ****** 33277 1726883066.47008: entering _queue_task() for managed_node2/yum 33277 1726883066.47549: worker is 1 (out of 1 available) 33277 1726883066.47560: exiting _queue_task() for managed_node2/yum 33277 1726883066.47570: done queuing things up, now waiting for results queue to drain 33277 1726883066.47571: waiting for pending results... 
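The `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line above is collection routing at work: on ansible-core 2.17, tasks written against the legacy `yum` module name are serviced by the `dnf` action plugin. A minimal sketch of a task that would trigger this redirect (package name and `check_mode` usage are illustrative, not taken from the role):

```yaml
# Illustrative only -- any task using the legacy module name is routed to dnf,
# as the "redirecting (type: action)" log line shows.
- name: Check whether a package update is available (dry run)
  ansible.builtin.yum:          # resolved to ansible.builtin.dnf at runtime
    name: NetworkManager        # placeholder package name
    state: latest
  check_mode: true              # report what would change without changing it
```

This is why the subsequent log line loads `ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf'` even though the task was queued as `managed_node2/yum`.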
33277 1726883066.47847: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 33277 1726883066.47854: in run() - task 0affc7ec-ae25-6628-6da4-000000000103 33277 1726883066.47857: variable 'ansible_search_path' from source: unknown 33277 1726883066.47860: variable 'ansible_search_path' from source: unknown 33277 1726883066.47894: calling self._execute() 33277 1726883066.47988: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.47993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.48057: variable 'omit' from source: magic vars 33277 1726883066.48395: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.48407: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.48545: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.48549: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.48551: when evaluation is False, skipping this task 33277 1726883066.48558: _execute() done 33277 1726883066.48561: dumping result to json 33277 1726883066.48564: done dumping result, returning 33277 1726883066.48576: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-000000000103] 33277 1726883066.48580: sending task result for task 0affc7ec-ae25-6628-6da4-000000000103 33277 1726883066.48679: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000103 33277 1726883066.48683: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 33277 1726883066.48843: no more pending results, returning what we have 33277 1726883066.48845: results queue empty 33277 1726883066.48846: checking for any_errors_fatal 33277 1726883066.48853: done checking for any_errors_fatal 33277 1726883066.48854: checking for max_fail_percentage 33277 1726883066.48855: done checking for max_fail_percentage 33277 1726883066.48856: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.48859: done checking to see if all hosts have failed 33277 1726883066.48860: getting the remaining hosts for this loop 33277 1726883066.48861: done getting the remaining hosts for this loop 33277 1726883066.48865: getting the next task for host managed_node2 33277 1726883066.48873: done getting next task for host managed_node2 33277 1726883066.48876: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33277 1726883066.48879: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.48893: getting variables 33277 1726883066.48894: in VariableManager get_vars() 33277 1726883066.48941: Calling all_inventory to load vars for managed_node2 33277 1726883066.48943: Calling groups_inventory to load vars for managed_node2 33277 1726883066.48945: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.48952: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.48953: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.48955: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.49144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.49376: done with get_vars() 33277 1726883066.49667: done getting variables 33277 1726883066.49932: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:44:26 -0400 (0:00:00.030) 0:00:09.688 ****** 33277 1726883066.49968: entering _queue_task() for managed_node2/fail 33277 1726883066.50618: worker is 1 (out of 1 available) 33277 1726883066.50637: exiting _queue_task() for managed_node2/fail 33277 1726883066.50651: done queuing things up, now waiting for results queue to drain 33277 1726883066.50653: waiting for pending results... 
33277 1726883066.51040: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 33277 1726883066.51147: in run() - task 0affc7ec-ae25-6628-6da4-000000000104 33277 1726883066.51170: variable 'ansible_search_path' from source: unknown 33277 1726883066.51194: variable 'ansible_search_path' from source: unknown 33277 1726883066.51302: calling self._execute() 33277 1726883066.51351: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.51363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.51378: variable 'omit' from source: magic vars 33277 1726883066.51831: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.51861: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.52010: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.52130: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.52133: when evaluation is False, skipping this task 33277 1726883066.52137: _execute() done 33277 1726883066.52140: dumping result to json 33277 1726883066.52142: done dumping result, returning 33277 1726883066.52145: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-000000000104] 33277 1726883066.52148: sending task result for task 0affc7ec-ae25-6628-6da4-000000000104 33277 1726883066.52398: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000104 33277 1726883066.52401: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.52462: no more pending results, returning what we have 
33277 1726883066.52466: results queue empty 33277 1726883066.52467: checking for any_errors_fatal 33277 1726883066.52474: done checking for any_errors_fatal 33277 1726883066.52475: checking for max_fail_percentage 33277 1726883066.52477: done checking for max_fail_percentage 33277 1726883066.52478: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.52479: done checking to see if all hosts have failed 33277 1726883066.52480: getting the remaining hosts for this loop 33277 1726883066.52481: done getting the remaining hosts for this loop 33277 1726883066.52486: getting the next task for host managed_node2 33277 1726883066.52495: done getting next task for host managed_node2 33277 1726883066.52500: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 33277 1726883066.52505: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.52529: getting variables 33277 1726883066.52531: in VariableManager get_vars() 33277 1726883066.52578: Calling all_inventory to load vars for managed_node2 33277 1726883066.52581: Calling groups_inventory to load vars for managed_node2 33277 1726883066.52583: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.52596: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.52599: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.52601: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.52944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.53186: done with get_vars() 33277 1726883066.53197: done getting variables 33277 1726883066.53261: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:44:26 -0400 (0:00:00.033) 0:00:09.721 ****** 33277 1726883066.53303: entering _queue_task() for managed_node2/package 33277 1726883066.53585: worker is 1 (out of 1 available) 33277 1726883066.53599: exiting _queue_task() for managed_node2/package 33277 1726883066.53724: done queuing things up, now waiting for results queue to drain 33277 1726883066.53726: waiting for pending results... 
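The "Install packages" task above is queued through the generic `package` action, which dispatches to the platform's package manager (dnf on this Fedora-family host). A hedged sketch of the general shape, assuming the role supplies a package list via a variable; the variable name below is a hypothetical stand-in, not confirmed by the log:

```yaml
# Illustrative only -- "network_provider_packages" is an assumed variable name.
- name: Install packages
  ansible.builtin.package:      # delegates to dnf/yum/apt/... per platform
    name: "{{ network_provider_packages }}"
    state: present
```

Like the earlier tasks in this trace, it is gated by the same `when` conditions, so on this run it is skipped before the package manager is ever invoked.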
33277 1726883066.54042: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages
33277 1726883066.54249: in run() - task 0affc7ec-ae25-6628-6da4-000000000105
33277 1726883066.54321: variable 'ansible_search_path' from source: unknown
33277 1726883066.54325: variable 'ansible_search_path' from source: unknown
33277 1726883066.54328: calling self._execute()
33277 1726883066.54374: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883066.54380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883066.54391: variable 'omit' from source: magic vars
33277 1726883066.54698: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.54712: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883066.54879: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.54884: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883066.54890: when evaluation is False, skipping this task
33277 1726883066.54893: _execute() done
33277 1726883066.54897: dumping result to json
33277 1726883066.54900: done dumping result, returning
33277 1726883066.54903: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-6628-6da4-000000000105]
33277 1726883066.54906: sending task result for task 0affc7ec-ae25-6628-6da4-000000000105
33277 1726883066.54998: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000105
33277 1726883066.55001: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883066.55077: no more pending results, returning what we have
33277 1726883066.55081: results queue empty
33277 1726883066.55081: checking for any_errors_fatal
33277 1726883066.55090: done checking for any_errors_fatal
33277 1726883066.55090: checking for max_fail_percentage
33277 1726883066.55092: done checking for max_fail_percentage
33277 1726883066.55093: checking to see if all hosts have failed and the running result is not ok
33277 1726883066.55094: done checking to see if all hosts have failed
33277 1726883066.55094: getting the remaining hosts for this loop
33277 1726883066.55095: done getting the remaining hosts for this loop
33277 1726883066.55099: getting the next task for host managed_node2
33277 1726883066.55104: done getting next task for host managed_node2
33277 1726883066.55108: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
33277 1726883066.55112: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
33277 1726883066.55130: getting variables
33277 1726883066.55132: in VariableManager get_vars()
33277 1726883066.55171: Calling all_inventory to load vars for managed_node2
33277 1726883066.55173: Calling groups_inventory to load vars for managed_node2
33277 1726883066.55175: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883066.55182: Calling all_plugins_play to load vars for managed_node2
33277 1726883066.55184: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883066.55188: Calling groups_plugins_play to load vars for managed_node2
33277 1726883066.55338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883066.55569: done with get_vars()
33277 1726883066.55577: done getting variables
33277 1726883066.55649: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:44:26 -0400 (0:00:00.023) 0:00:09.745 ******
33277 1726883066.55680: entering _queue_task() for managed_node2/package
33277 1726883066.55991: worker is 1 (out of 1 available)
33277 1726883066.56005: exiting _queue_task() for managed_node2/package
33277 1726883066.56016: done queuing things up, now waiting for results queue to drain
33277 1726883066.56018: waiting for pending results...
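The skipped result above comes from the two `when` evaluations the executor logs (`ansible_distribution_major_version != '6'` → True, then `== '7'` → False). A minimal sketch of a task of that shape follows; only the task name, the module (`package`), and the two conditions are confirmed by the log, while the package list variable is an illustrative assumption, not taken from the role:

```yaml
# Hypothetical sketch; the real task body in
# fedora.linux_system_roles.network may differ.
- name: Install packages
  package:
    name: "{{ network_packages }}"  # illustrative variable name
    state: present
  when:
    - ansible_distribution_major_version != '6'  # logged as True
    - ansible_distribution_major_version == '7'  # logged as False => task skipped
```

Because the second condition is False, `_execute()` returns a skip result without ever dispatching the module to the remote host.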
33277 1726883066.56280: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
33277 1726883066.56368: in run() - task 0affc7ec-ae25-6628-6da4-000000000106
33277 1726883066.56411: variable 'ansible_search_path' from source: unknown
33277 1726883066.56416: variable 'ansible_search_path' from source: unknown
33277 1726883066.56452: calling self._execute()
33277 1726883066.56527: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883066.56531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883066.56554: variable 'omit' from source: magic vars
33277 1726883066.56953: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.56961: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883066.57071: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.57075: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883066.57080: when evaluation is False, skipping this task
33277 1726883066.57083: _execute() done
33277 1726883066.57088: dumping result to json
33277 1726883066.57090: done dumping result, returning
33277 1726883066.57097: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-6628-6da4-000000000106]
33277 1726883066.57101: sending task result for task 0affc7ec-ae25-6628-6da4-000000000106
33277 1726883066.57201: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000106
33277 1726883066.57204: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883066.57258: no more pending results, returning what we have
33277 1726883066.57262: results queue empty
33277 1726883066.57263: checking for any_errors_fatal
33277 1726883066.57267: done checking for any_errors_fatal
33277 1726883066.57268: checking for max_fail_percentage
33277 1726883066.57270: done checking for max_fail_percentage
33277 1726883066.57270: checking to see if all hosts have failed and the running result is not ok
33277 1726883066.57271: done checking to see if all hosts have failed
33277 1726883066.57272: getting the remaining hosts for this loop
33277 1726883066.57273: done getting the remaining hosts for this loop
33277 1726883066.57276: getting the next task for host managed_node2
33277 1726883066.57283: done getting next task for host managed_node2
33277 1726883066.57289: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
33277 1726883066.57292: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
33277 1726883066.57309: getting variables
33277 1726883066.57310: in VariableManager get_vars()
33277 1726883066.57355: Calling all_inventory to load vars for managed_node2
33277 1726883066.57357: Calling groups_inventory to load vars for managed_node2
33277 1726883066.57359: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883066.57366: Calling all_plugins_play to load vars for managed_node2
33277 1726883066.57367: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883066.57369: Calling groups_plugins_play to load vars for managed_node2
33277 1726883066.57544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883066.57861: done with get_vars()
33277 1726883066.57871: done getting variables
33277 1726883066.58170: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:44:26 -0400 (0:00:00.025) 0:00:09.770 ******
33277 1726883066.58204: entering _queue_task() for managed_node2/package
33277 1726883066.58663: worker is 1 (out of 1 available)
33277 1726883066.58675: exiting _queue_task() for managed_node2/package
33277 1726883066.58685: done queuing things up, now waiting for results queue to drain
33277 1726883066.58687: waiting for pending results...
33277 1726883066.59248: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
33277 1726883066.59255: in run() - task 0affc7ec-ae25-6628-6da4-000000000107
33277 1726883066.59258: variable 'ansible_search_path' from source: unknown
33277 1726883066.59262: variable 'ansible_search_path' from source: unknown
33277 1726883066.59265: calling self._execute()
33277 1726883066.59343: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883066.59349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883066.59358: variable 'omit' from source: magic vars
33277 1726883066.59883: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.59889: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883066.59892: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.59910: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883066.59913: when evaluation is False, skipping this task
33277 1726883066.59916: _execute() done
33277 1726883066.59918: dumping result to json
33277 1726883066.59923: done dumping result, returning
33277 1726883066.59933: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-6628-6da4-000000000107]
33277 1726883066.59936: sending task result for task 0affc7ec-ae25-6628-6da4-000000000107
33277 1726883066.60159: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000107
33277 1726883066.60163: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883066.60199: no more pending results, returning what we have
33277 1726883066.60203: results queue empty
33277 1726883066.60204: checking for any_errors_fatal
33277 1726883066.60209: done checking for any_errors_fatal
33277 1726883066.60209: checking for max_fail_percentage
33277 1726883066.60211: done checking for max_fail_percentage
33277 1726883066.60212: checking to see if all hosts have failed and the running result is not ok
33277 1726883066.60213: done checking to see if all hosts have failed
33277 1726883066.60214: getting the remaining hosts for this loop
33277 1726883066.60215: done getting the remaining hosts for this loop
33277 1726883066.60218: getting the next task for host managed_node2
33277 1726883066.60226: done getting next task for host managed_node2
33277 1726883066.60230: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
33277 1726883066.60234: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
33277 1726883066.60251: getting variables
33277 1726883066.60252: in VariableManager get_vars()
33277 1726883066.60294: Calling all_inventory to load vars for managed_node2
33277 1726883066.60297: Calling groups_inventory to load vars for managed_node2
33277 1726883066.60299: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883066.60308: Calling all_plugins_play to load vars for managed_node2
33277 1726883066.60311: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883066.60314: Calling groups_plugins_play to load vars for managed_node2
33277 1726883066.60535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883066.60773: done with get_vars()
33277 1726883066.60784: done getting variables
33277 1726883066.60847: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:44:26 -0400 (0:00:00.026) 0:00:09.797 ******
33277 1726883066.60881: entering _queue_task() for managed_node2/service
33277 1726883066.61127: worker is 1 (out of 1 available)
33277 1726883066.61150: exiting _queue_task() for managed_node2/service
33277 1726883066.61162: done queuing things up, now waiting for results queue to drain
33277 1726883066.61163: waiting for pending results...
33277 1726883066.61539: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
33277 1726883066.61545: in run() - task 0affc7ec-ae25-6628-6da4-000000000108
33277 1726883066.61548: variable 'ansible_search_path' from source: unknown
33277 1726883066.61551: variable 'ansible_search_path' from source: unknown
33277 1726883066.61596: calling self._execute()
33277 1726883066.61700: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883066.61713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883066.61734: variable 'omit' from source: magic vars
33277 1726883066.62335: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.62359: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883066.62538: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.62542: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883066.62551: when evaluation is False, skipping this task
33277 1726883066.62624: _execute() done
33277 1726883066.62630: dumping result to json
33277 1726883066.62633: done dumping result, returning
33277 1726883066.62635: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-6628-6da4-000000000108]
33277 1726883066.62638: sending task result for task 0affc7ec-ae25-6628-6da4-000000000108
33277 1726883066.62928: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000108
33277 1726883066.62932: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883066.62969: no more pending results, returning what we have
33277 1726883066.62972: results queue empty
33277 1726883066.62973: checking for any_errors_fatal
33277 1726883066.62978: done checking for any_errors_fatal
33277 1726883066.62978: checking for max_fail_percentage
33277 1726883066.62980: done checking for max_fail_percentage
33277 1726883066.62981: checking to see if all hosts have failed and the running result is not ok
33277 1726883066.62981: done checking to see if all hosts have failed
33277 1726883066.62982: getting the remaining hosts for this loop
33277 1726883066.62983: done getting the remaining hosts for this loop
33277 1726883066.62989: getting the next task for host managed_node2
33277 1726883066.62995: done getting next task for host managed_node2
33277 1726883066.62999: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
33277 1726883066.63003: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
33277 1726883066.63020: getting variables
33277 1726883066.63024: in VariableManager get_vars()
33277 1726883066.63065: Calling all_inventory to load vars for managed_node2
33277 1726883066.63067: Calling groups_inventory to load vars for managed_node2
33277 1726883066.63069: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883066.63078: Calling all_plugins_play to load vars for managed_node2
33277 1726883066.63081: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883066.63084: Calling groups_plugins_play to load vars for managed_node2
33277 1726883066.63812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883066.64350: done with get_vars()
33277 1726883066.64361: done getting variables
33277 1726883066.64660: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:44:26 -0400 (0:00:00.038) 0:00:09.835 ******
33277 1726883066.64700: entering _queue_task() for managed_node2/service
33277 1726883066.65665: worker is 1 (out of 1 available)
33277 1726883066.65678: exiting _queue_task() for managed_node2/service
33277 1726883066.65691: done queuing things up, now waiting for results queue to drain
33277 1726883066.65693: waiting for pending results...
33277 1726883066.66166: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
33277 1726883066.66547: in run() - task 0affc7ec-ae25-6628-6da4-000000000109
33277 1726883066.66777: variable 'ansible_search_path' from source: unknown
33277 1726883066.66781: variable 'ansible_search_path' from source: unknown
33277 1726883066.66864: calling self._execute()
33277 1726883066.67128: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883066.67628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883066.67632: variable 'omit' from source: magic vars
33277 1726883066.68505: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.68653: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883066.68899: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.68911: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883066.68918: when evaluation is False, skipping this task
33277 1726883066.68933: _execute() done
33277 1726883066.68968: dumping result to json
33277 1726883066.68977: done dumping result, returning
33277 1726883066.68993: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-6628-6da4-000000000109]
33277 1726883066.69038: sending task result for task 0affc7ec-ae25-6628-6da4-000000000109
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
33277 1726883066.69483: no more pending results, returning what we have
33277 1726883066.69490: results queue empty
33277 1726883066.69492: checking for any_errors_fatal
33277 1726883066.69500: done checking for any_errors_fatal
33277 1726883066.69501: checking for max_fail_percentage
33277 1726883066.69502: done checking for max_fail_percentage
33277 1726883066.69504: checking to see if all hosts have failed and the running result is not ok
33277 1726883066.69505: done checking to see if all hosts have failed
33277 1726883066.69505: getting the remaining hosts for this loop
33277 1726883066.69507: done getting the remaining hosts for this loop
33277 1726883066.69512: getting the next task for host managed_node2
33277 1726883066.69520: done getting next task for host managed_node2
33277 1726883066.69530: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
33277 1726883066.69536: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
33277 1726883066.69560: getting variables
33277 1726883066.69562: in VariableManager get_vars()
33277 1726883066.69619: Calling all_inventory to load vars for managed_node2
33277 1726883066.69874: Calling groups_inventory to load vars for managed_node2
33277 1726883066.69878: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883066.69892: Calling all_plugins_play to load vars for managed_node2
33277 1726883066.69894: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883066.69898: Calling groups_plugins_play to load vars for managed_node2
33277 1726883066.70408: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000109
33277 1726883066.70412: WORKER PROCESS EXITING
33277 1726883066.70445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883066.71139: done with get_vars()
33277 1726883066.71156: done getting variables
33277 1726883066.71308: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 21:44:26 -0400 (0:00:00.066) 0:00:09.902 ******
33277 1726883066.71393: entering _queue_task() for managed_node2/service
33277 1726883066.71988: worker is 1 (out of 1 available)
33277 1726883066.72003: exiting _queue_task() for managed_node2/service
33277 1726883066.72130: done queuing things up, now waiting for results queue to drain
33277 1726883066.72132: waiting for pending results...
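The "Enable and start NetworkManager" result above is reported only as censored because the task sets `no_log`. A minimal sketch of a `service` task that would behave this way follows; the task name, the `service` action plugin, the `no_log` censoring, and the false `when` condition are confirmed by the log, while the exact module arguments are illustrative assumptions:

```yaml
# Hypothetical sketch; `no_log: true` is what replaces the result
# body with the "censored" message seen in the log.
- name: Enable and start NetworkManager
  service:
    name: NetworkManager  # illustrative arguments, not from the log
    state: started
    enabled: true
  no_log: true
  when: ansible_distribution_major_version == '7'  # logged as False => skipped
```

Note that `no_log` censoring applies even to skip results, which is why `false_condition` and `skip_reason` are hidden here but visible for the uncensored tasks.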
33277 1726883066.72502: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
33277 1726883066.72624: in run() - task 0affc7ec-ae25-6628-6da4-00000000010a
33277 1726883066.72838: variable 'ansible_search_path' from source: unknown
33277 1726883066.72849: variable 'ansible_search_path' from source: unknown
33277 1726883066.73073: calling self._execute()
33277 1726883066.73468: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883066.73474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883066.73476: variable 'omit' from source: magic vars
33277 1726883066.74196: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.74282: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883066.74546: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.74558: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883066.74722: when evaluation is False, skipping this task
33277 1726883066.74727: _execute() done
33277 1726883066.74730: dumping result to json
33277 1726883066.74733: done dumping result, returning
33277 1726883066.74735: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-6628-6da4-00000000010a]
33277 1726883066.74738: sending task result for task 0affc7ec-ae25-6628-6da4-00000000010a
33277 1726883066.74960: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000010a
33277 1726883066.74963: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
33277 1726883066.75016: no more pending results, returning what we have
33277 1726883066.75020: results queue empty
33277 1726883066.75023: checking for any_errors_fatal
33277 1726883066.75036: done checking for any_errors_fatal
33277 1726883066.75037: checking for max_fail_percentage
33277 1726883066.75039: done checking for max_fail_percentage
33277 1726883066.75040: checking to see if all hosts have failed and the running result is not ok
33277 1726883066.75045: done checking to see if all hosts have failed
33277 1726883066.75046: getting the remaining hosts for this loop
33277 1726883066.75048: done getting the remaining hosts for this loop
33277 1726883066.75053: getting the next task for host managed_node2
33277 1726883066.75060: done getting next task for host managed_node2
33277 1726883066.75182: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
33277 1726883066.75189: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
33277 1726883066.75210: getting variables
33277 1726883066.75212: in VariableManager get_vars()
33277 1726883066.75454: Calling all_inventory to load vars for managed_node2
33277 1726883066.75457: Calling groups_inventory to load vars for managed_node2
33277 1726883066.75459: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883066.75470: Calling all_plugins_play to load vars for managed_node2
33277 1726883066.75473: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883066.75476: Calling groups_plugins_play to load vars for managed_node2
33277 1726883066.75972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883066.76561: done with get_vars()
33277 1726883066.76571: done getting variables
33277 1726883066.76726: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 21:44:26 -0400 (0:00:00.054) 0:00:09.956 ******
33277 1726883066.76820: entering _queue_task() for managed_node2/service
33277 1726883066.77477: worker is 1 (out of 1 available)
33277 1726883066.77494: exiting _queue_task() for managed_node2/service
33277 1726883066.77507: done queuing things up, now waiting for results queue to drain
33277 1726883066.77508: waiting for pending results...
33277 1726883066.78240: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service
33277 1726883066.78628: in run() - task 0affc7ec-ae25-6628-6da4-00000000010b
33277 1726883066.78632: variable 'ansible_search_path' from source: unknown
33277 1726883066.78635: variable 'ansible_search_path' from source: unknown
33277 1726883066.78638: calling self._execute()
33277 1726883066.78641: variable 'ansible_host' from source: host vars for 'managed_node2'
33277 1726883066.78644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
33277 1726883066.78646: variable 'omit' from source: magic vars
33277 1726883066.79397: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.79444: Evaluated conditional (ansible_distribution_major_version != '6'): True
33277 1726883066.79751: variable 'ansible_distribution_major_version' from source: facts
33277 1726883066.79763: Evaluated conditional (ansible_distribution_major_version == '7'): False
33277 1726883066.79770: when evaluation is False, skipping this task
33277 1726883066.79777: _execute() done
33277 1726883066.79785: dumping result to json
33277 1726883066.79793: done dumping result, returning
33277 1726883066.79805: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-6628-6da4-00000000010b]
33277 1726883066.79815: sending task result for task 0affc7ec-ae25-6628-6da4-00000000010b
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
33277 1726883066.79965: no more pending results, returning what we have
33277 1726883066.79970: results queue empty
33277 1726883066.79971: checking for any_errors_fatal
33277 1726883066.79976: done checking for any_errors_fatal
33277 1726883066.79977: checking for max_fail_percentage
33277 1726883066.79979: done checking for max_fail_percentage
33277 1726883066.79980: checking to see if all hosts have failed and the running result is not ok
33277 1726883066.79980: done checking to see if all hosts have failed
33277 1726883066.79981: getting the remaining hosts for this loop
33277 1726883066.79983: done getting the remaining hosts for this loop
33277 1726883066.79990: getting the next task for host managed_node2
33277 1726883066.79998: done getting next task for host managed_node2
33277 1726883066.80003: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
33277 1726883066.80008: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
33277 1726883066.80031: getting variables
33277 1726883066.80034: in VariableManager get_vars()
33277 1726883066.80090: Calling all_inventory to load vars for managed_node2
33277 1726883066.80094: Calling groups_inventory to load vars for managed_node2
33277 1726883066.80096: Calling all_plugins_inventory to load vars for managed_node2
33277 1726883066.80110: Calling all_plugins_play to load vars for managed_node2
33277 1726883066.80113: Calling groups_plugins_inventory to load vars for managed_node2
33277 1726883066.80116: Calling groups_plugins_play to load vars for managed_node2
33277 1726883066.80972: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000010b
33277 1726883066.80976: WORKER PROCESS EXITING
33277 1726883066.81093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
33277 1726883066.81668: done with get_vars()
33277 1726883066.81678: done getting variables
33277 1726883066.81848: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 21:44:26 -0400 (0:00:00.051) 0:00:10.008 ******
33277 1726883066.82001: entering _queue_task() for managed_node2/copy
33277 1726883066.82546: worker is 1 (out of 1 available)
33277 1726883066.82561: exiting _queue_task() for managed_node2/copy
33277 1726883066.82571: done queuing things up, now waiting for results queue to drain
33277 1726883066.82573: waiting for pending results...
33277 1726883066.83242: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 33277 1726883066.83248: in run() - task 0affc7ec-ae25-6628-6da4-00000000010c 33277 1726883066.83251: variable 'ansible_search_path' from source: unknown 33277 1726883066.83254: variable 'ansible_search_path' from source: unknown 33277 1726883066.83528: calling self._execute() 33277 1726883066.83560: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.83827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.83831: variable 'omit' from source: magic vars 33277 1726883066.84361: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.84544: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.85028: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.85031: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.85034: when evaluation is False, skipping this task 33277 1726883066.85038: _execute() done 33277 1726883066.85042: dumping result to json 33277 1726883066.85045: done dumping result, returning 33277 1726883066.85051: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-6628-6da4-00000000010c] 33277 1726883066.85054: sending task result for task 0affc7ec-ae25-6628-6da4-00000000010c skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.85193: no more pending results, returning what we have 33277 1726883066.85198: results queue empty 33277 1726883066.85200: checking for any_errors_fatal 33277 1726883066.85206: done checking for any_errors_fatal 33277 1726883066.85206: checking for 
max_fail_percentage 33277 1726883066.85208: done checking for max_fail_percentage 33277 1726883066.85209: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.85209: done checking to see if all hosts have failed 33277 1726883066.85210: getting the remaining hosts for this loop 33277 1726883066.85211: done getting the remaining hosts for this loop 33277 1726883066.85216: getting the next task for host managed_node2 33277 1726883066.85224: done getting next task for host managed_node2 33277 1726883066.85228: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33277 1726883066.85233: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.85259: getting variables 33277 1726883066.85261: in VariableManager get_vars() 33277 1726883066.85312: Calling all_inventory to load vars for managed_node2 33277 1726883066.85315: Calling groups_inventory to load vars for managed_node2 33277 1726883066.85317: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.85441: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.85445: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.85451: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.85866: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000010c 33277 1726883066.85871: WORKER PROCESS EXITING 33277 1726883066.85895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.86402: done with get_vars() 33277 1726883066.86413: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:44:26 -0400 (0:00:00.047) 0:00:10.055 ****** 33277 1726883066.86710: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 33277 1726883066.87100: worker is 1 (out of 1 available) 33277 1726883066.87115: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 33277 1726883066.87130: done queuing things up, now waiting for results queue to drain 33277 1726883066.87132: waiting for pending results... 
33277 1726883066.87395: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 33277 1726883066.87546: in run() - task 0affc7ec-ae25-6628-6da4-00000000010d 33277 1726883066.87560: variable 'ansible_search_path' from source: unknown 33277 1726883066.87564: variable 'ansible_search_path' from source: unknown 33277 1726883066.87602: calling self._execute() 33277 1726883066.87705: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.87712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.87721: variable 'omit' from source: magic vars 33277 1726883066.88180: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.88203: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.88330: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.88337: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.88340: when evaluation is False, skipping this task 33277 1726883066.88345: _execute() done 33277 1726883066.88348: dumping result to json 33277 1726883066.88350: done dumping result, returning 33277 1726883066.88353: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-6628-6da4-00000000010d] 33277 1726883066.88371: sending task result for task 0affc7ec-ae25-6628-6da4-00000000010d 33277 1726883066.88474: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000010d 33277 1726883066.88477: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.88533: no more pending results, returning what we have 33277 1726883066.88537: results queue empty 33277 1726883066.88538: checking 
for any_errors_fatal 33277 1726883066.88544: done checking for any_errors_fatal 33277 1726883066.88545: checking for max_fail_percentage 33277 1726883066.88546: done checking for max_fail_percentage 33277 1726883066.88547: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.88547: done checking to see if all hosts have failed 33277 1726883066.88548: getting the remaining hosts for this loop 33277 1726883066.88549: done getting the remaining hosts for this loop 33277 1726883066.88553: getting the next task for host managed_node2 33277 1726883066.88560: done getting next task for host managed_node2 33277 1726883066.88564: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 33277 1726883066.88568: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.88586: getting variables 33277 1726883066.88588: in VariableManager get_vars() 33277 1726883066.88631: Calling all_inventory to load vars for managed_node2 33277 1726883066.88634: Calling groups_inventory to load vars for managed_node2 33277 1726883066.88636: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.88644: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.88647: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.88649: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.89700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.90364: done with get_vars() 33277 1726883066.90381: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:44:26 -0400 (0:00:00.037) 0:00:10.093 ****** 33277 1726883066.90534: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 33277 1726883066.91064: worker is 1 (out of 1 available) 33277 1726883066.91078: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 33277 1726883066.91093: done queuing things up, now waiting for results queue to drain 33277 1726883066.91094: waiting for pending results... 
33277 1726883066.91795: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 33277 1726883066.91800: in run() - task 0affc7ec-ae25-6628-6da4-00000000010e 33277 1726883066.91803: variable 'ansible_search_path' from source: unknown 33277 1726883066.91806: variable 'ansible_search_path' from source: unknown 33277 1726883066.91833: calling self._execute() 33277 1726883066.91972: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.91976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.92127: variable 'omit' from source: magic vars 33277 1726883066.92474: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.92497: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.92578: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.92585: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.92594: when evaluation is False, skipping this task 33277 1726883066.92600: _execute() done 33277 1726883066.92603: dumping result to json 33277 1726883066.92606: done dumping result, returning 33277 1726883066.92609: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-6628-6da4-00000000010e] 33277 1726883066.92612: sending task result for task 0affc7ec-ae25-6628-6da4-00000000010e skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883066.92766: no more pending results, returning what we have 33277 1726883066.92770: results queue empty 33277 1726883066.92771: checking for any_errors_fatal 33277 1726883066.92776: done checking for any_errors_fatal 33277 1726883066.92777: checking for max_fail_percentage 33277 1726883066.92779: done 
checking for max_fail_percentage 33277 1726883066.92779: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.92781: done checking to see if all hosts have failed 33277 1726883066.92782: getting the remaining hosts for this loop 33277 1726883066.92783: done getting the remaining hosts for this loop 33277 1726883066.92787: getting the next task for host managed_node2 33277 1726883066.92794: done getting next task for host managed_node2 33277 1726883066.92798: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33277 1726883066.92802: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.92837: getting variables 33277 1726883066.92838: in VariableManager get_vars() 33277 1726883066.92880: Calling all_inventory to load vars for managed_node2 33277 1726883066.92883: Calling groups_inventory to load vars for managed_node2 33277 1726883066.92885: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.92894: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.92896: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.92899: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.93037: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000010e 33277 1726883066.93046: WORKER PROCESS EXITING 33277 1726883066.93056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.93303: done with get_vars() 33277 1726883066.93315: done getting variables 33277 1726883066.93378: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:44:26 -0400 (0:00:00.028) 0:00:10.122 ****** 33277 1726883066.93412: entering _queue_task() for managed_node2/debug 33277 1726883066.93685: worker is 1 (out of 1 available) 33277 1726883066.93697: exiting _queue_task() for managed_node2/debug 33277 1726883066.93710: done queuing things up, now waiting for results queue to drain 33277 1726883066.93711: waiting for pending results... 
33277 1726883066.94142: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 33277 1726883066.94181: in run() - task 0affc7ec-ae25-6628-6da4-00000000010f 33277 1726883066.94206: variable 'ansible_search_path' from source: unknown 33277 1726883066.94210: variable 'ansible_search_path' from source: unknown 33277 1726883066.94255: calling self._execute() 33277 1726883066.94531: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.94535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.94538: variable 'omit' from source: magic vars 33277 1726883066.95607: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.95620: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.96130: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.96134: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.96136: when evaluation is False, skipping this task 33277 1726883066.96139: _execute() done 33277 1726883066.96142: dumping result to json 33277 1726883066.96144: done dumping result, returning 33277 1726883066.96147: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-6628-6da4-00000000010f] 33277 1726883066.96149: sending task result for task 0affc7ec-ae25-6628-6da4-00000000010f skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883066.96282: no more pending results, returning what we have 33277 1726883066.96287: results queue empty 33277 1726883066.96288: checking for any_errors_fatal 33277 1726883066.96295: done checking for any_errors_fatal 33277 1726883066.96301: checking for max_fail_percentage 33277 1726883066.96303: done checking for 
max_fail_percentage 33277 1726883066.96304: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.96305: done checking to see if all hosts have failed 33277 1726883066.96306: getting the remaining hosts for this loop 33277 1726883066.96307: done getting the remaining hosts for this loop 33277 1726883066.96313: getting the next task for host managed_node2 33277 1726883066.96320: done getting next task for host managed_node2 33277 1726883066.96327: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33277 1726883066.96332: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.96348: done sending task result for task 0affc7ec-ae25-6628-6da4-00000000010f 33277 1726883066.96353: WORKER PROCESS EXITING 33277 1726883066.96365: getting variables 33277 1726883066.96368: in VariableManager get_vars() 33277 1726883066.96539: Calling all_inventory to load vars for managed_node2 33277 1726883066.96543: Calling groups_inventory to load vars for managed_node2 33277 1726883066.96545: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.96558: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.96561: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.96564: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.97035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.97306: done with get_vars() 33277 1726883066.97319: done getting variables 33277 1726883066.97406: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:44:26 -0400 (0:00:00.040) 0:00:10.162 ****** 33277 1726883066.97444: entering _queue_task() for managed_node2/debug 33277 1726883066.97769: worker is 1 (out of 1 available) 33277 1726883066.97782: exiting _queue_task() for managed_node2/debug 33277 1726883066.97798: done queuing things up, now waiting for results queue to drain 33277 1726883066.97800: waiting for pending results... 
33277 1726883066.98041: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 33277 1726883066.98229: in run() - task 0affc7ec-ae25-6628-6da4-000000000110 33277 1726883066.98234: variable 'ansible_search_path' from source: unknown 33277 1726883066.98237: variable 'ansible_search_path' from source: unknown 33277 1726883066.98240: calling self._execute() 33277 1726883066.98308: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883066.98314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883066.98327: variable 'omit' from source: magic vars 33277 1726883066.98723: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.98735: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883066.98863: variable 'ansible_distribution_major_version' from source: facts 33277 1726883066.98867: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883066.98869: when evaluation is False, skipping this task 33277 1726883066.98873: _execute() done 33277 1726883066.98875: dumping result to json 33277 1726883066.98905: done dumping result, returning 33277 1726883066.98908: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-6628-6da4-000000000110] 33277 1726883066.98911: sending task result for task 0affc7ec-ae25-6628-6da4-000000000110 33277 1726883066.99091: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000110 33277 1726883066.99095: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883066.99156: no more pending results, returning what we have 33277 1726883066.99158: results queue empty 33277 1726883066.99159: checking for any_errors_fatal 33277 1726883066.99164: done 
checking for any_errors_fatal 33277 1726883066.99165: checking for max_fail_percentage 33277 1726883066.99166: done checking for max_fail_percentage 33277 1726883066.99167: checking to see if all hosts have failed and the running result is not ok 33277 1726883066.99168: done checking to see if all hosts have failed 33277 1726883066.99169: getting the remaining hosts for this loop 33277 1726883066.99170: done getting the remaining hosts for this loop 33277 1726883066.99173: getting the next task for host managed_node2 33277 1726883066.99179: done getting next task for host managed_node2 33277 1726883066.99182: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33277 1726883066.99189: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883066.99211: getting variables 33277 1726883066.99213: in VariableManager get_vars() 33277 1726883066.99262: Calling all_inventory to load vars for managed_node2 33277 1726883066.99265: Calling groups_inventory to load vars for managed_node2 33277 1726883066.99268: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883066.99277: Calling all_plugins_play to load vars for managed_node2 33277 1726883066.99279: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883066.99282: Calling groups_plugins_play to load vars for managed_node2 33277 1726883066.99512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883066.99814: done with get_vars() 33277 1726883066.99827: done getting variables 33277 1726883066.99894: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:44:26 -0400 (0:00:00.024) 0:00:10.187 ****** 33277 1726883066.99931: entering _queue_task() for managed_node2/debug 33277 1726883067.00441: worker is 1 (out of 1 available) 33277 1726883067.00448: exiting _queue_task() for managed_node2/debug 33277 1726883067.00459: done queuing things up, now waiting for results queue to drain 33277 1726883067.00460: waiting for pending results... 
33277 1726883067.00593: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 33277 1726883067.00700: in run() - task 0affc7ec-ae25-6628-6da4-000000000111 33277 1726883067.00723: variable 'ansible_search_path' from source: unknown 33277 1726883067.00731: variable 'ansible_search_path' from source: unknown 33277 1726883067.00774: calling self._execute() 33277 1726883067.00984: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883067.01001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883067.01018: variable 'omit' from source: magic vars 33277 1726883067.01897: variable 'ansible_distribution_major_version' from source: facts 33277 1726883067.01994: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883067.02229: variable 'ansible_distribution_major_version' from source: facts 33277 1726883067.02240: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883067.02247: when evaluation is False, skipping this task 33277 1726883067.02254: _execute() done 33277 1726883067.02261: dumping result to json 33277 1726883067.02277: done dumping result, returning 33277 1726883067.02328: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-6628-6da4-000000000111] 33277 1726883067.02336: sending task result for task 0affc7ec-ae25-6628-6da4-000000000111 33277 1726883067.02729: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000111 33277 1726883067.02732: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "ansible_distribution_major_version == '7'" } 33277 1726883067.02776: no more pending results, returning what we have 33277 1726883067.02780: results queue empty 33277 1726883067.02781: checking for any_errors_fatal 33277 1726883067.02789: done checking for 
any_errors_fatal 33277 1726883067.02790: checking for max_fail_percentage 33277 1726883067.02792: done checking for max_fail_percentage 33277 1726883067.02793: checking to see if all hosts have failed and the running result is not ok 33277 1726883067.02794: done checking to see if all hosts have failed 33277 1726883067.02795: getting the remaining hosts for this loop 33277 1726883067.02796: done getting the remaining hosts for this loop 33277 1726883067.02800: getting the next task for host managed_node2 33277 1726883067.02808: done getting next task for host managed_node2 33277 1726883067.02813: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 33277 1726883067.02817: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883067.02841: getting variables 33277 1726883067.02844: in VariableManager get_vars() 33277 1726883067.02894: Calling all_inventory to load vars for managed_node2 33277 1726883067.02897: Calling groups_inventory to load vars for managed_node2 33277 1726883067.02901: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883067.02911: Calling all_plugins_play to load vars for managed_node2 33277 1726883067.02914: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883067.02917: Calling groups_plugins_play to load vars for managed_node2 33277 1726883067.03241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883067.03523: done with get_vars() 33277 1726883067.03546: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:44:27 -0400 (0:00:00.037) 0:00:10.225 ******

33277 1726883067.03675: entering _queue_task() for managed_node2/ping 33277 1726883067.04007: worker is 1 (out of 1 available) 33277 1726883067.04133: exiting _queue_task() for managed_node2/ping 33277 1726883067.04145: done queuing things up, now waiting for results queue to drain 33277 1726883067.04147: waiting for pending results... 
33277 1726883067.04365: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 33277 1726883067.04531: in run() - task 0affc7ec-ae25-6628-6da4-000000000112 33277 1726883067.04555: variable 'ansible_search_path' from source: unknown 33277 1726883067.04569: variable 'ansible_search_path' from source: unknown 33277 1726883067.04621: calling self._execute() 33277 1726883067.04761: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883067.04808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883067.04836: variable 'omit' from source: magic vars 33277 1726883067.05760: variable 'ansible_distribution_major_version' from source: facts 33277 1726883067.05763: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883067.05913: variable 'ansible_distribution_major_version' from source: facts 33277 1726883067.05985: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883067.05995: when evaluation is False, skipping this task 33277 1726883067.06001: _execute() done 33277 1726883067.06196: dumping result to json 33277 1726883067.06199: done dumping result, returning 33277 1726883067.06202: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-6628-6da4-000000000112] 33277 1726883067.06204: sending task result for task 0affc7ec-ae25-6628-6da4-000000000112 33277 1726883067.06279: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000112 33277 1726883067.06282: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883067.06354: no more pending results, returning what we have 33277 1726883067.06358: results queue empty 33277 1726883067.06359: checking for any_errors_fatal 33277 
1726883067.06368: done checking for any_errors_fatal 33277 1726883067.06369: checking for max_fail_percentage 33277 1726883067.06370: done checking for max_fail_percentage 33277 1726883067.06372: checking to see if all hosts have failed and the running result is not ok 33277 1726883067.06372: done checking to see if all hosts have failed 33277 1726883067.06373: getting the remaining hosts for this loop 33277 1726883067.06375: done getting the remaining hosts for this loop 33277 1726883067.06380: getting the next task for host managed_node2 33277 1726883067.06393: done getting next task for host managed_node2 33277 1726883067.06396: ^ task is: TASK: meta (role_complete) 33277 1726883067.06401: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883067.06433: getting variables 33277 1726883067.06435: in VariableManager get_vars() 33277 1726883067.06494: Calling all_inventory to load vars for managed_node2 33277 1726883067.06498: Calling groups_inventory to load vars for managed_node2 33277 1726883067.06500: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883067.06739: Calling all_plugins_play to load vars for managed_node2 33277 1726883067.06743: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883067.06748: Calling groups_plugins_play to load vars for managed_node2 33277 1726883067.07563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883067.08097: done with get_vars() 33277 1726883067.08115: done getting variables 33277 1726883067.08271: done queuing things up, now waiting for results queue to drain 33277 1726883067.08273: results queue empty 33277 1726883067.08274: checking for any_errors_fatal 33277 1726883067.08277: done checking for any_errors_fatal 33277 1726883067.08278: checking for max_fail_percentage 33277 1726883067.08279: done checking for max_fail_percentage 33277 1726883067.08280: checking to see if all hosts have failed and the running result is not ok 33277 1726883067.08280: done checking to see if all hosts have failed 33277 1726883067.08281: getting the remaining hosts for this loop 33277 1726883067.08282: done getting the remaining hosts for this loop 33277 1726883067.08288: getting the next task for host managed_node2 33277 1726883067.08324: done getting next task for host managed_node2 33277 1726883067.08327: ^ task is: TASK: Include the task 'cleanup_mock_wifi.yml' 33277 1726883067.08329: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 33277 1726883067.08332: getting variables 33277 1726883067.08334: in VariableManager get_vars() 33277 1726883067.08353: Calling all_inventory to load vars for managed_node2 33277 1726883067.08355: Calling groups_inventory to load vars for managed_node2 33277 1726883067.08357: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883067.08362: Calling all_plugins_play to load vars for managed_node2 33277 1726883067.08365: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883067.08368: Calling groups_plugins_play to load vars for managed_node2 33277 1726883067.08669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883067.09210: done with get_vars() 33277 1726883067.09220: done getting variables

TASK [Include the task 'cleanup_mock_wifi.yml'] ********************************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96
Friday 20 September 2024 21:44:27 -0400 (0:00:00.056) 0:00:10.281 ******

33277 1726883067.09344: entering _queue_task() for managed_node2/include_tasks 33277 1726883067.10016: worker is 1 (out of 1 available) 33277 1726883067.10160: exiting _queue_task() for managed_node2/include_tasks 33277 1726883067.10173: done queuing things up, now waiting for results queue to drain 33277 1726883067.10174: waiting for pending results... 
33277 1726883067.10480: running TaskExecutor() for managed_node2/TASK: Include the task 'cleanup_mock_wifi.yml' 33277 1726883067.10694: in run() - task 0affc7ec-ae25-6628-6da4-000000000142 33277 1726883067.10698: variable 'ansible_search_path' from source: unknown 33277 1726883067.10708: calling self._execute() 33277 1726883067.10775: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883067.10792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883067.10812: variable 'omit' from source: magic vars 33277 1726883067.11249: variable 'ansible_distribution_major_version' from source: facts 33277 1726883067.11273: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883067.11412: variable 'ansible_distribution_major_version' from source: facts 33277 1726883067.11425: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883067.11457: when evaluation is False, skipping this task 33277 1726883067.11460: _execute() done 33277 1726883067.11462: dumping result to json 33277 1726883067.11465: done dumping result, returning 33277 1726883067.11468: done running TaskExecutor() for managed_node2/TASK: Include the task 'cleanup_mock_wifi.yml' [0affc7ec-ae25-6628-6da4-000000000142] 33277 1726883067.11527: sending task result for task 0affc7ec-ae25-6628-6da4-000000000142 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883067.11726: no more pending results, returning what we have 33277 1726883067.11731: results queue empty 33277 1726883067.11732: checking for any_errors_fatal 33277 1726883067.11735: done checking for any_errors_fatal 33277 1726883067.11735: checking for max_fail_percentage 33277 1726883067.11737: done checking for max_fail_percentage 33277 1726883067.11738: checking to see if all hosts have failed and the running result is 
not ok 33277 1726883067.11739: done checking to see if all hosts have failed 33277 1726883067.11740: getting the remaining hosts for this loop 33277 1726883067.11741: done getting the remaining hosts for this loop 33277 1726883067.11746: getting the next task for host managed_node2 33277 1726883067.11754: done getting next task for host managed_node2 33277 1726883067.11757: ^ task is: TASK: Verify network state restored to default 33277 1726883067.11760: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 33277 1726883067.11765: getting variables 33277 1726883067.11766: in VariableManager get_vars() 33277 1726883067.11829: Calling all_inventory to load vars for managed_node2 33277 1726883067.11832: Calling groups_inventory to load vars for managed_node2 33277 1726883067.11834: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883067.11849: Calling all_plugins_play to load vars for managed_node2 33277 1726883067.11851: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883067.11854: Calling groups_plugins_play to load vars for managed_node2 33277 1726883067.12332: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000142 33277 1726883067.12336: WORKER PROCESS EXITING 33277 1726883067.12364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883067.12623: done with get_vars() 33277 1726883067.12634: done getting variables

TASK [Verify network state restored to default] ********************************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98
Friday 20 September 2024 21:44:27 -0400 (0:00:00.033) 0:00:10.315 ******

33277 1726883067.12737: entering _queue_task() for managed_node2/include_tasks 33277 1726883067.13143: worker is 1 (out of 1 available) 33277 1726883067.13154: exiting _queue_task() for managed_node2/include_tasks 33277 1726883067.13165: done queuing things up, now waiting for results queue to drain 33277 1726883067.13167: waiting for pending results... 
33277 1726883067.13582: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 33277 1726883067.13785: in run() - task 0affc7ec-ae25-6628-6da4-000000000143 33277 1726883067.13798: variable 'ansible_search_path' from source: unknown 33277 1726883067.13945: calling self._execute() 33277 1726883067.14141: variable 'ansible_host' from source: host vars for 'managed_node2' 33277 1726883067.14152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 33277 1726883067.14163: variable 'omit' from source: magic vars 33277 1726883067.14975: variable 'ansible_distribution_major_version' from source: facts 33277 1726883067.14991: Evaluated conditional (ansible_distribution_major_version != '6'): True 33277 1726883067.15183: variable 'ansible_distribution_major_version' from source: facts 33277 1726883067.15189: Evaluated conditional (ansible_distribution_major_version == '7'): False 33277 1726883067.15192: when evaluation is False, skipping this task 33277 1726883067.15196: _execute() done 33277 1726883067.15203: dumping result to json 33277 1726883067.15205: done dumping result, returning 33277 1726883067.15208: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0affc7ec-ae25-6628-6da4-000000000143] 33277 1726883067.15209: sending task result for task 0affc7ec-ae25-6628-6da4-000000000143 33277 1726883067.15283: done sending task result for task 0affc7ec-ae25-6628-6da4-000000000143 33277 1726883067.15290: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 33277 1726883067.15464: no more pending results, returning what we have 33277 1726883067.15467: results queue empty 33277 1726883067.15468: checking for any_errors_fatal 33277 1726883067.15473: done checking for any_errors_fatal 33277 1726883067.15474: checking for max_fail_percentage 33277 
1726883067.15476: done checking for max_fail_percentage 33277 1726883067.15477: checking to see if all hosts have failed and the running result is not ok 33277 1726883067.15478: done checking to see if all hosts have failed 33277 1726883067.15479: getting the remaining hosts for this loop 33277 1726883067.15480: done getting the remaining hosts for this loop 33277 1726883067.15484: getting the next task for host managed_node2 33277 1726883067.15494: done getting next task for host managed_node2 33277 1726883067.15497: ^ task is: TASK: meta (flush_handlers) 33277 1726883067.15499: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883067.15504: getting variables 33277 1726883067.15505: in VariableManager get_vars() 33277 1726883067.15554: Calling all_inventory to load vars for managed_node2 33277 1726883067.15557: Calling groups_inventory to load vars for managed_node2 33277 1726883067.15560: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883067.15570: Calling all_plugins_play to load vars for managed_node2 33277 1726883067.15573: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883067.15577: Calling groups_plugins_play to load vars for managed_node2 33277 1726883067.15866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883067.16142: done with get_vars() 33277 1726883067.16159: done getting variables 33277 1726883067.16241: in VariableManager get_vars() 33277 1726883067.16266: Calling all_inventory to load vars for managed_node2 33277 1726883067.16269: Calling groups_inventory to load vars for managed_node2 33277 1726883067.16271: Calling all_plugins_inventory to load vars for managed_node2 33277 
1726883067.16277: Calling all_plugins_play to load vars for managed_node2 33277 1726883067.16279: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883067.16283: Calling groups_plugins_play to load vars for managed_node2 33277 1726883067.16509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883067.16877: done with get_vars() 33277 1726883067.16892: done queuing things up, now waiting for results queue to drain 33277 1726883067.16894: results queue empty 33277 1726883067.16895: checking for any_errors_fatal 33277 1726883067.16897: done checking for any_errors_fatal 33277 1726883067.16898: checking for max_fail_percentage 33277 1726883067.16898: done checking for max_fail_percentage 33277 1726883067.16899: checking to see if all hosts have failed and the running result is not ok 33277 1726883067.16900: done checking to see if all hosts have failed 33277 1726883067.16900: getting the remaining hosts for this loop 33277 1726883067.16901: done getting the remaining hosts for this loop 33277 1726883067.16904: getting the next task for host managed_node2 33277 1726883067.16907: done getting next task for host managed_node2 33277 1726883067.16909: ^ task is: TASK: meta (flush_handlers) 33277 1726883067.16910: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 33277 1726883067.16912: getting variables 33277 1726883067.16913: in VariableManager get_vars() 33277 1726883067.16934: Calling all_inventory to load vars for managed_node2 33277 1726883067.16936: Calling groups_inventory to load vars for managed_node2 33277 1726883067.16938: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883067.16943: Calling all_plugins_play to load vars for managed_node2 33277 1726883067.16945: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883067.16947: Calling groups_plugins_play to load vars for managed_node2 33277 1726883067.17108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883067.17339: done with get_vars() 33277 1726883067.17348: done getting variables 33277 1726883067.17401: in VariableManager get_vars() 33277 1726883067.17418: Calling all_inventory to load vars for managed_node2 33277 1726883067.17420: Calling groups_inventory to load vars for managed_node2 33277 1726883067.17424: Calling all_plugins_inventory to load vars for managed_node2 33277 1726883067.17429: Calling all_plugins_play to load vars for managed_node2 33277 1726883067.17431: Calling groups_plugins_inventory to load vars for managed_node2 33277 1726883067.17434: Calling groups_plugins_play to load vars for managed_node2 33277 1726883067.17613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 33277 1726883067.17881: done with get_vars() 33277 1726883067.17897: done queuing things up, now waiting for results queue to drain 33277 1726883067.17901: results queue empty 33277 1726883067.17902: checking for any_errors_fatal 33277 1726883067.17903: done checking for any_errors_fatal 33277 1726883067.17904: checking for max_fail_percentage 33277 1726883067.17905: done checking for max_fail_percentage 33277 1726883067.17906: checking to see if all hosts have failed and the running result is not 
ok 33277 1726883067.17907: done checking to see if all hosts have failed 33277 1726883067.17907: getting the remaining hosts for this loop 33277 1726883067.17908: done getting the remaining hosts for this loop 33277 1726883067.17916: getting the next task for host managed_node2 33277 1726883067.17919: done getting next task for host managed_node2 33277 1726883067.17920: ^ task is: None 33277 1726883067.17923: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 33277 1726883067.17924: done queuing things up, now waiting for results queue to drain 33277 1726883067.17925: results queue empty 33277 1726883067.17926: checking for any_errors_fatal 33277 1726883067.17927: done checking for any_errors_fatal 33277 1726883067.17928: checking for max_fail_percentage 33277 1726883067.17929: done checking for max_fail_percentage 33277 1726883067.17929: checking to see if all hosts have failed and the running result is not ok 33277 1726883067.17930: done checking to see if all hosts have failed 33277 1726883067.17933: getting the next task for host managed_node2 33277 1726883067.17935: done getting next task for host managed_node2 33277 1726883067.17936: ^ task is: None 33277 1726883067.17937: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 

PLAY RECAP *********************************************************************
managed_node2              : ok=7    changed=0    unreachable=0    failed=0    skipped=102  rescued=0    ignored=0

Friday 20 September 2024 21:44:27 -0400 (0:00:00.052) 0:00:10.368 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 3.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Gather the minimum subset of ansible_facts required by the network role test --- 2.43s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.66s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Include the task 'enable_epel.yml' -------------------------------------- 0.15s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Copy client certs ------------------------------------------------------- 0.08s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13
Set flag to indicate system is ostree ----------------------------------- 0.07s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
Set network provider to 'nm' -------------------------------------------- 0.07s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.07s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Show debug messages for the network_state --- 0.07s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Install packages -------------------- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 --- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
fedora.linux_system_roles.network : Show debug messages for the network_connections --- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
fedora.linux_system_roles.network : Enable and start wpa_supplicant ----- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
TEST: wireless connection with WPA-PSK ---------------------------------- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces --- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces --- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Enable network service -------------- 0.06s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
33277 1726883067.18050: RUNNING CLEANUP