[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.

12613 1727096135.88144: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-And
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
12613 1727096135.88446: Added group all to inventory
12613 1727096135.88448: Added group ungrouped to inventory
12613 1727096135.88453: Group all now contains ungrouped
12613 1727096135.88455: Examining possible inventory source: /tmp/network-EuO/inventory.yml
12613 1727096135.97754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
12613 1727096135.97799: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
12613 1727096135.97817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
12613 1727096135.97859: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
12613 1727096135.97909: Loaded config def from plugin (inventory/script)
12613 1727096135.97910: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
12613 1727096135.97939: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
12613 1727096135.98004: Loaded config def from plugin (inventory/yaml)
12613 1727096135.98006: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
12613 1727096135.98086: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
12613 1727096135.98367: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
12613 1727096135.98372: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
12613 1727096135.98374: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
12613 1727096135.98380: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
12613 1727096135.98384: Loading data from /tmp/network-EuO/inventory.yml
12613 1727096135.98425: /tmp/network-EuO/inventory.yml was not parsable by auto
12613 1727096135.98479: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
12613 1727096135.98518: Loading data from /tmp/network-EuO/inventory.yml
12613 1727096135.98574: group all already in inventory
12613 1727096135.98579: set inventory_file for managed_node1
12613 1727096135.98582: set inventory_dir for managed_node1
12613 1727096135.98583: Added host managed_node1 to inventory
12613 1727096135.98585: Added host managed_node1 to group all
12613 1727096135.98585: set ansible_host for managed_node1
12613 1727096135.98586:
set ansible_ssh_extra_args for managed_node1 12613 1727096135.98588: set inventory_file for managed_node2 12613 1727096135.98589: set inventory_dir for managed_node2 12613 1727096135.98590: Added host managed_node2 to inventory 12613 1727096135.98591: Added host managed_node2 to group all 12613 1727096135.98591: set ansible_host for managed_node2 12613 1727096135.98592: set ansible_ssh_extra_args for managed_node2 12613 1727096135.98593: set inventory_file for managed_node3 12613 1727096135.98595: set inventory_dir for managed_node3 12613 1727096135.98595: Added host managed_node3 to inventory 12613 1727096135.98596: Added host managed_node3 to group all 12613 1727096135.98596: set ansible_host for managed_node3 12613 1727096135.98597: set ansible_ssh_extra_args for managed_node3 12613 1727096135.98599: Reconcile groups and hosts in inventory. 12613 1727096135.98601: Group ungrouped now contains managed_node1 12613 1727096135.98602: Group ungrouped now contains managed_node2 12613 1727096135.98603: Group ungrouped now contains managed_node3 12613 1727096135.98660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 12613 1727096135.98743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 12613 1727096135.98775: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 12613 1727096135.98792: Loaded config def from plugin (vars/host_group_vars) 12613 1727096135.98794: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 12613 1727096135.98799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 12613 1727096135.98804: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 12613 1727096135.98832: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 12613 1727096135.99079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096135.99143: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 12613 1727096135.99173: Loaded config def from plugin (connection/local) 12613 1727096135.99175: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 12613 1727096135.99673: Loaded config def from plugin (connection/paramiko_ssh) 12613 1727096135.99677: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 12613 1727096136.00548: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12613 1727096136.00595: Loaded config def from plugin (connection/psrp) 12613 1727096136.00599: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 12613 1727096136.01036: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12613 1727096136.01062: Loaded config def from plugin (connection/ssh) 12613 1727096136.01065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 12613 1727096136.02405: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12613 1727096136.02429: Loaded config def from plugin (connection/winrm) 12613 1727096136.02431: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 12613 1727096136.02455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 12613 1727096136.02531: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 12613 1727096136.02591: Loaded config def from plugin (shell/cmd) 12613 1727096136.02593: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 12613 1727096136.02619: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 12613 1727096136.02683: Loaded config def from plugin (shell/powershell) 12613 1727096136.02685: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 12613 1727096136.02743: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 12613 1727096136.02922: Loaded config def from plugin (shell/sh) 12613 1727096136.02924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 12613 1727096136.02960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 12613 1727096136.03083: Loaded config def from plugin (become/runas) 12613 1727096136.03085: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 12613 1727096136.03236: Loaded config def from plugin (become/su) 12613 1727096136.03238: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 12613 1727096136.03378: Loaded config def from plugin (become/sudo) 12613 1727096136.03380: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 12613 1727096136.03414: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml 12613 1727096136.03739: in VariableManager get_vars() 12613 1727096136.03760: done with get_vars() 12613 1727096136.03905: trying /usr/local/lib/python3.12/site-packages/ansible/modules 12613 1727096136.06893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 12613 1727096136.07007: in VariableManager 
get_vars() 12613 1727096136.07012: done with get_vars() 12613 1727096136.07015: variable 'playbook_dir' from source: magic vars 12613 1727096136.07016: variable 'ansible_playbook_python' from source: magic vars 12613 1727096136.07017: variable 'ansible_config_file' from source: magic vars 12613 1727096136.07018: variable 'groups' from source: magic vars 12613 1727096136.07018: variable 'omit' from source: magic vars 12613 1727096136.07019: variable 'ansible_version' from source: magic vars 12613 1727096136.07020: variable 'ansible_check_mode' from source: magic vars 12613 1727096136.07021: variable 'ansible_diff_mode' from source: magic vars 12613 1727096136.07021: variable 'ansible_forks' from source: magic vars 12613 1727096136.07022: variable 'ansible_inventory_sources' from source: magic vars 12613 1727096136.07023: variable 'ansible_skip_tags' from source: magic vars 12613 1727096136.07023: variable 'ansible_limit' from source: magic vars 12613 1727096136.07024: variable 'ansible_run_tags' from source: magic vars 12613 1727096136.07025: variable 'ansible_verbosity' from source: magic vars 12613 1727096136.07060: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml 12613 1727096136.08162: in VariableManager get_vars() 12613 1727096136.08182: done with get_vars() 12613 1727096136.08192: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12613 1727096136.09099: in VariableManager get_vars() 12613 1727096136.09114: done with get_vars() 12613 1727096136.09122: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12613 1727096136.09227: in VariableManager get_vars() 12613 1727096136.09244: done with get_vars() 12613 1727096136.09386: in VariableManager get_vars() 12613 1727096136.09398: done with get_vars() 12613 1727096136.09407: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12613 1727096136.09482: in VariableManager get_vars() 12613 1727096136.09495: done with get_vars() 12613 1727096136.09778: in VariableManager get_vars() 12613 1727096136.09793: done with get_vars() 12613 1727096136.09799: variable 'omit' from source: magic vars 12613 1727096136.09819: variable 'omit' from source: magic vars 12613 1727096136.09854: in VariableManager get_vars() 12613 1727096136.09865: done with get_vars() 12613 1727096136.09914: in VariableManager get_vars() 12613 1727096136.09925: done with get_vars() 12613 1727096136.09959: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12613 1727096136.10184: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12613 1727096136.10315: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 
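At this point the parser has resolved the test playbook and its static imports: tasks/create_test_interfaces_with_dhcp.yml and tasks/assert_device_present.yml are pulled in at parse time, and the role's defaults/main.yml, meta/main.yml and tasks/main.yml are read. The playbook source itself is not part of this trace, so the following YAML is only a rough, hypothetical sketch of the kind of structure that produces these messages; the task names, the controller_profile value, and the use of import_role here are assumptions for illustration.

# Hypothetical sketch -- the real tests_bond_removal.yml in the collection may differ.
- name: Test removing a bond connection
  hosts: all
  vars:
    controller_profile: bond0        # play var referenced later in the trace; value assumed
  tasks:
    # Static imports (import_tasks) are resolved while the playbook is parsed,
    # which is why these files appear in the trace before any task has run.
    - name: Create test interfaces
      import_tasks: tasks/create_test_interfaces_with_dhcp.yml

    - name: Assert device is present
      import_tasks: tasks/assert_device_present.yml

    # Statically importing the role is what triggers loading of the role's
    # defaults/main.yml, meta/main.yml and tasks/main.yml seen above.
    - name: Configure the bond profile
      import_role:
        name: fedora.linux_system_roles.network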
12613 1727096136.12942: in VariableManager get_vars() 12613 1727096136.12964: done with get_vars() 12613 1727096136.13378: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 12613 1727096136.13512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096136.15114: in VariableManager get_vars() 12613 1727096136.15137: done with get_vars() 12613 1727096136.15148: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12613 1727096136.15335: in VariableManager get_vars() 12613 1727096136.15354: done with get_vars() 12613 1727096136.15495: in VariableManager get_vars() 12613 1727096136.15523: done with get_vars() 12613 1727096136.15815: in VariableManager get_vars() 12613 1727096136.15833: done with get_vars() 12613 1727096136.15838: variable 'omit' from source: magic vars 12613 1727096136.15850: variable 'omit' from source: magic vars 12613 1727096136.16237: variable 'controller_profile' from source: play vars 12613 1727096136.16492: in VariableManager get_vars() 12613 1727096136.16508: done with get_vars() 12613 1727096136.16530: in VariableManager get_vars() 12613 1727096136.16546: done with get_vars() 12613 1727096136.16584: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12613 1727096136.17045: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12613 1727096136.17124: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12613 1727096136.17589: in VariableManager get_vars() 12613 1727096136.17610: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096136.20988: in VariableManager get_vars() 12613 1727096136.21011: done with get_vars() 12613 1727096136.21016: variable 'omit' from source: magic vars 12613 1727096136.21028: variable 'omit' from source: magic vars 12613 1727096136.21060: in VariableManager get_vars() 12613 1727096136.21083: done with get_vars() 12613 1727096136.21103: in VariableManager get_vars() 12613 1727096136.21119: done with get_vars() 12613 1727096136.21148: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12613 1727096136.21262: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12613 1727096136.21350: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12613 1727096136.21782: in VariableManager get_vars() 12613 1727096136.21806: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096136.24384: in VariableManager get_vars() 12613 1727096136.24410: done with get_vars() 12613 1727096136.24416: variable 'omit' from source: magic vars 12613 1727096136.24428: variable 'omit' from source: magic vars 12613 1727096136.24457: in VariableManager get_vars() 12613 1727096136.24497: done with get_vars() 12613 1727096136.24517: in VariableManager get_vars() 12613 1727096136.24535: done with get_vars() 12613 1727096136.24560: Loading data from 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12613 1727096136.24657: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12613 1727096136.24722: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12613 1727096136.25121: in VariableManager get_vars() 12613 1727096136.25147: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096136.28455: in VariableManager get_vars() 12613 1727096136.28491: done with get_vars() 12613 1727096136.28496: variable 'omit' from source: magic vars 12613 1727096136.28522: variable 'omit' from source: magic vars 12613 1727096136.28582: in VariableManager get_vars() 12613 1727096136.28605: done with get_vars() 12613 1727096136.28624: in VariableManager get_vars() 12613 1727096136.28646: done with get_vars() 12613 1727096136.28684: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12613 1727096136.28821: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12613 1727096136.28903: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12613 1727096136.29406: in VariableManager get_vars() 12613 1727096136.29438: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096136.32616: in VariableManager get_vars() 12613 1727096136.32654: done with get_vars() 12613 1727096136.32664: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12613 1727096136.33169: in VariableManager get_vars() 12613 1727096136.33202: done with get_vars() 12613 1727096136.33262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 12613 1727096136.33278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 12613 1727096136.33532: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 12613 1727096136.33701: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 12613 1727096136.33704: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 12613 1727096136.33740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 12613 1727096136.33765: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 12613 1727096136.33969: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 12613 1727096136.34029: Loaded config def from plugin (callback/default) 12613 1727096136.34032: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 12613 1727096136.35976: Loaded config def from plugin (callback/junit) 12613 1727096136.35981: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 12613 1727096136.36033: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 12613 1727096136.36120: Loaded config def from plugin (callback/minimal) 12613 1727096136.36123: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 12613 1727096136.36163: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 12613 1727096136.36231: Loaded config def from plugin (callback/tree) 12613 1727096136.36234: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 12613 1727096136.36365: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 12613 1727096136.36369: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
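Before the PLAYBOOK banner below, it helps to relate the run back to the inventory parsed at the very start of the trace: the yaml inventory plugin read /tmp/network-EuO/inventory.yml and, for managed_node1 through managed_node3, set ansible_host and ansible_ssh_extra_args and placed the hosts in the all and ungrouped groups. A minimal inventory that would produce those messages looks roughly like this; the addresses and SSH options are placeholders, since the real values are not shown in the excerpt.

# inventory.yml -- placeholder values, for illustration only
all:
  hosts:
    managed_node1:
      ansible_host: 203.0.113.11
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"
    managed_node2:
      ansible_host: 203.0.113.12
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"
    managed_node3:
      ansible_host: 203.0.113.13
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"

Hosts defined directly under all land in the built-in ungrouped group, which matches the "Group ungrouped now contains managed_nodeN" messages earlier in the trace.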
PLAYBOOK: tests_bond_removal_initscripts.yml ***********************************
2 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml
12613 1727096136.36396: in VariableManager get_vars()
12613 1727096136.36411: done with get_vars()
12613 1727096136.36421: in VariableManager get_vars()
12613 1727096136.36435: done with get_vars()
12613 1727096136.36439: variable 'omit' from source: magic vars
12613 1727096136.36478: in VariableManager get_vars()
12613 1727096136.36491: done with get_vars()
12613 1727096136.36511: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with initscripts as provider] ***
12613 1727096136.37133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
12613 1727096136.37215: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
12613 1727096136.37245: getting the remaining hosts for this loop
12613 1727096136.37246: done getting the remaining hosts for this loop
12613 1727096136.37249: getting the next task for host managed_node1
12613 1727096136.37253: done getting next task for host managed_node1
12613 1727096136.37255: ^ task is: TASK: Gathering Facts
12613 1727096136.37256: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12613 1727096136.37259: getting variables
12613 1727096136.37260: in VariableManager get_vars()
12613 1727096136.37271: Calling all_inventory to load vars for managed_node1
12613 1727096136.37274: Calling groups_inventory to load vars for managed_node1
12613 1727096136.37283: Calling all_plugins_inventory to load vars for managed_node1
12613 1727096136.37302: Calling all_plugins_play to load vars for managed_node1
12613 1727096136.37313: Calling groups_plugins_inventory to load vars for managed_node1
12613 1727096136.37317: Calling groups_plugins_play to load vars for managed_node1
12613 1727096136.37366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12613 1727096136.37431: done with get_vars()
12613 1727096136.37440: done getting variables
12613 1727096136.37528: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:5
Monday 23 September 2024 08:55:36 -0400 (0:00:00.012) 0:00:00.012 ******
12613 1727096136.37551: entering _queue_task() for managed_node1/gather_facts
12613 1727096136.37553: Creating lock for gather_facts
12613 1727096136.38109: worker is 1 (out of 1 available)
12613 1727096136.38122: exiting _queue_task() for managed_node1/gather_facts
12613 1727096136.38137: done queuing things up, now waiting for results queue to drain
12613 1727096136.38139: waiting for pending results...
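The implicit "Gathering Facts" task is the gather_facts action; on a POSIX target it ultimately runs the ansible.builtin.setup module, which is exactly the AnsiballZ_setup.py payload built, transferred and executed in the remainder of this trace. If automatic fact gathering were disabled (gather_facts: false on the play), a roughly equivalent explicit task would look like the sketch below; it is shown for orientation only and is not part of the test playbook.

- name: Gathering Facts (explicit equivalent of the implicit task)
  ansible.builtin.setup:
    gather_subset:
      - all        # "all" is the default subset; listed only to make it visible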
12613 1727096136.38422: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12613 1727096136.38607: in run() - task 0afff68d-5257-a9dd-d073-0000000001bc 12613 1727096136.38614: variable 'ansible_search_path' from source: unknown 12613 1727096136.38637: calling self._execute() 12613 1727096136.38725: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096136.38743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096136.38825: variable 'omit' from source: magic vars 12613 1727096136.38895: variable 'omit' from source: magic vars 12613 1727096136.38956: variable 'omit' from source: magic vars 12613 1727096136.38999: variable 'omit' from source: magic vars 12613 1727096136.39062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12613 1727096136.39119: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12613 1727096136.39158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12613 1727096136.39184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096136.39260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096136.39265: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12613 1727096136.39269: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096136.39272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096136.39355: Set connection var ansible_connection to ssh 12613 1727096136.39392: Set connection var ansible_module_compression to ZIP_DEFLATED 12613 1727096136.39408: Set connection var ansible_timeout to 10 12613 1727096136.39420: Set connection var ansible_shell_type to sh 12613 1727096136.39431: Set connection var ansible_pipelining to False 12613 1727096136.39442: Set connection var ansible_shell_executable to /bin/sh 12613 1727096136.39481: variable 'ansible_shell_executable' from source: unknown 12613 1727096136.39573: variable 'ansible_connection' from source: unknown 12613 1727096136.39581: variable 'ansible_module_compression' from source: unknown 12613 1727096136.39584: variable 'ansible_shell_type' from source: unknown 12613 1727096136.39586: variable 'ansible_shell_executable' from source: unknown 12613 1727096136.39589: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096136.39592: variable 'ansible_pipelining' from source: unknown 12613 1727096136.39594: variable 'ansible_timeout' from source: unknown 12613 1727096136.39596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096136.39773: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12613 1727096136.39777: variable 'omit' from source: magic vars 12613 1727096136.39779: starting attempt loop 12613 1727096136.39781: running the handler 12613 1727096136.39792: variable 'ansible_facts' from source: unknown 12613 1727096136.39828: _low_level_execute_command(): starting 12613 1727096136.39842: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12613 1727096136.40762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096136.40786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 12613 1727096136.40800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096136.40824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096136.41017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096136.42638: stdout chunk (state=3): >>>/root <<< 12613 1727096136.42774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096136.42795: stdout chunk (state=3): >>><<< 12613 1727096136.42810: stderr chunk (state=3): >>><<< 12613 1727096136.42837: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096136.42941: _low_level_execute_command(): starting 12613 1727096136.42945: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898 `" && echo ansible-tmp-1727096136.4284427-12640-68497737261898="` echo /root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898 `" ) && sleep 0' 12613 1727096136.43505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 
1727096136.43519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096136.43535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096136.43554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12613 1727096136.43576: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 12613 1727096136.43623: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096136.43696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096136.43733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096136.43940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096136.46049: stdout chunk (state=3): >>>ansible-tmp-1727096136.4284427-12640-68497737261898=/root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898 <<< 12613 1727096136.46075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096136.46219: stderr chunk (state=3): >>><<< 12613 1727096136.46223: stdout chunk (state=3): >>><<< 12613 1727096136.46240: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096136.4284427-12640-68497737261898=/root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096136.46679: variable 'ansible_module_compression' from source: unknown 12613 1727096136.46682: ANSIBALLZ: Using generic lock for ansible.legacy.setup 12613 1727096136.46685: ANSIBALLZ: Acquiring lock 12613 1727096136.46687: ANSIBALLZ: Lock acquired: 
140022615747488 12613 1727096136.46689: ANSIBALLZ: Creating module 12613 1727096137.12199: ANSIBALLZ: Writing module into payload 12613 1727096137.12757: ANSIBALLZ: Writing module 12613 1727096137.12761: ANSIBALLZ: Renaming module 12613 1727096137.12763: ANSIBALLZ: Done creating module 12613 1727096137.12765: variable 'ansible_facts' from source: unknown 12613 1727096137.12770: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12613 1727096137.12772: _low_level_execute_command(): starting 12613 1727096137.12775: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 12613 1727096137.13983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096137.13987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096137.14298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096137.14420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096137.16161: stdout chunk (state=3): >>>PLATFORM <<< 12613 1727096137.16320: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 12613 1727096137.16430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096137.16434: stdout chunk (state=3): >>><<< 12613 1727096137.16436: stderr chunk (state=3): >>><<< 12613 1727096137.16453: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096137.16820 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 12613 1727096137.16824: _low_level_execute_command(): starting 12613 1727096137.16826: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 12613 1727096137.17006: Sending initial data 12613 1727096137.17077: Sent initial data (1181 bytes) 12613 1727096137.17941: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096137.17956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096137.18116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 12613 1727096137.18160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096137.18190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096137.18294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096137.21887: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 12613 1727096137.22431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096137.22435: stdout chunk (state=3): >>><<< 12613 1727096137.22453: stderr chunk (state=3): >>><<< 12613 1727096137.22457: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096137.22775: variable 'ansible_facts' from source: unknown 12613 1727096137.22779: variable 'ansible_facts' from source: unknown 12613 1727096137.22781: variable 'ansible_module_compression' from source: unknown 12613 1727096137.22783: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12613tatu8w7b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12613 1727096137.22785: variable 'ansible_facts' from source: unknown 12613 1727096137.23123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/AnsiballZ_setup.py 12613 1727096137.23586: Sending initial data 12613 1727096137.23589: Sent initial data (153 bytes) 12613 1727096137.24222: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096137.24257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096137.24276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096137.24322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 12613 1727096137.24335: stderr chunk (state=3): >>>debug2: match found <<< 12613 1727096137.24349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096137.24429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/8e98a30b23' <<< 12613 1727096137.24463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096137.24666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096137.26353: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12613 1727096137.26411: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12613 1727096137.26476: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12613tatu8w7b/tmpccvxxl1i /root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/AnsiballZ_setup.py <<< 12613 1727096137.26481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/AnsiballZ_setup.py" <<< 12613 1727096137.26564: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-12613tatu8w7b/tmpccvxxl1i" to remote "/root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/AnsiballZ_setup.py" <<< 12613 1727096137.29401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096137.29464: stderr chunk (state=3): >>><<< 12613 1727096137.29477: stdout chunk (state=3): >>><<< 12613 1727096137.29642: done transferring module to remote 12613 1727096137.29644: _low_level_execute_command(): starting 12613 1727096137.29649: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/ /root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/AnsiballZ_setup.py && sleep 0' 12613 1727096137.31129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096137.31439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096137.31574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096137.31635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096137.34109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096137.34113: stdout chunk (state=3): >>><<< 12613 1727096137.34116: stderr chunk (state=3): >>><<< 12613 1727096137.34118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096137.34120: _low_level_execute_command(): starting 12613 1727096137.34122: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/AnsiballZ_setup.py && sleep 0' 12613 1727096137.35302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096137.35479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096137.35487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096137.35594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096137.37876: stdout chunk (state=3): >>>import 
_frozen_importlib # frozen <<< 12613 1727096137.37918: stdout chunk (state=3): >>>import _imp # builtin <<< 12613 1727096137.37940: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12613 1727096137.38141: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 12613 1727096137.38145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096137.38147: stdout chunk (state=3): >>>import '_codecs' # <<< 12613 1727096137.38198: stdout chunk (state=3): >>>import 'codecs' # <<< 12613 1727096137.38206: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12613 1727096137.38571: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d4184d0><<< 12613 1727096137.38782: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d3e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d41aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d22d130> <<< 12613 1727096137.39007: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d22dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 12613 1727096137.39133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12613 1727096137.39147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12613 1727096137.39162: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 12613 1727096137.39165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096137.39189: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12613 1727096137.39235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12613 1727096137.39239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 12613 1727096137.39276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12613 1727096137.39292: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d26be00> <<< 12613 1727096137.39593: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d26bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2a37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2a3e60> import '_collections' # <<< 12613 1727096137.39631: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d283ad0> <<< 12613 1727096137.39636: stdout chunk (state=3): >>>import '_functools' # <<< 12613 1727096137.39665: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2811f0> <<< 12613 1727096137.39779: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d268fb0> <<< 12613 1727096137.39786: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 12613 1727096137.39789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 12613 1727096137.39791: stdout chunk (state=3): >>>import '_sre' # <<< 12613 
1727096137.39813: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 12613 1727096137.39900: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 12613 1727096137.39903: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 12613 1727096137.39906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 12613 1727096137.39908: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2c3770> <<< 12613 1727096137.40002: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2c2390> <<< 12613 1727096137.40072: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 12613 1727096137.40076: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d282090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2c0bc0> <<< 12613 1727096137.40080: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 12613 1727096137.40083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2f8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d268230> <<< 12613 1727096137.40431: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d2f8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2f8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d2f8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d266d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 12613 1727096137.40436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 12613 1727096137.40439: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2f9580> import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2f9250> <<< 12613 1727096137.40441: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 12613 1727096137.40478: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2fa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d3106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d311d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 12613 1727096137.40487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 12613 1727096137.40503: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d312c30> <<< 12613 1727096137.40682: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d313290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d312180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d313d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d313440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2fa4e0> <<< 12613 1727096137.40688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12613 1727096137.40722: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12613 1727096137.40727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12613 1727096137.40757: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12613 1727096137.40782: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096137.40793: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d047bc0> <<< 12613 1727096137.41087: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d0706e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d070440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d070620> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096137.41091: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d070fe0> <<< 12613 1727096137.41449: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d071970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d070890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d045d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d072cf0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d070e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2fabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12613 1727096137.41590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d09f020> <<< 12613 1727096137.41594: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12613 1727096137.41597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096137.41677: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12613 1727096137.41680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12613 1727096137.41683: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d0c33e0> <<< 12613 1727096137.41702: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12613 1727096137.41794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12613 1727096137.41823: stdout chunk (state=3): >>>import 'ntpath' # <<< 12613 1727096137.41831: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d1201a0> <<< 12613 1727096137.41864: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12613 1727096137.41871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12613 1727096137.41903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12613 1727096137.42271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d1228d0> <<< 12613 1727096137.42278: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d1202c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d0ed190> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf251f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d0c21e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d073bf0> <<< 12613 1727096137.42492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc97d0c28a0> <<< 12613 1727096137.42645: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_ansible.legacy.setup_payload_25e1cyeb/ansible_ansible.legacy.setup_payload.zip' <<< 12613 1727096137.42653: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.42772: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.43081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf86f00> import '_typing' # <<< 12613 1727096137.43143: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf65df0> <<< 12613 1727096137.43150: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf64f50> # zipimport: zlib available <<< 12613 1727096137.43181: stdout chunk (state=3): >>>import 'ansible' # <<< 12613 1727096137.43185: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.43213: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.43225: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.43237: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 12613 1727096137.43240: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.44776: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.45817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 12613 1727096137.45821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf84c50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 12613 1727096137.45856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096137.45860: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12613 1727096137.45890: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 12613 1727096137.45899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12613 1727096137.45919: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096137.45986: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97cfbe7e0> <<< 12613 1727096137.46012: stdout 
chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cfbe570> <<< 12613 1727096137.46015: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cfbde80> <<< 12613 1727096137.46017: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 12613 1727096137.46115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cfbe5d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2829c0> <<< 12613 1727096137.46118: stdout chunk (state=3): >>>import 'atexit' # <<< 12613 1727096137.46121: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97cfbf4d0> <<< 12613 1727096137.46202: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97cfbf710> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12613 1727096137.46206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12613 1727096137.46208: stdout chunk (state=3): >>>import '_locale' # <<< 12613 1727096137.46255: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cfbfc50> <<< 12613 1727096137.46272: stdout chunk (state=3): >>>import 'pwd' # <<< 12613 1727096137.46280: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12613 1727096137.46311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12613 1727096137.46397: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c9259d0> <<< 12613 1727096137.46401: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c9275f0> <<< 12613 1727096137.46500: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c927f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12613 1727096137.46519: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c929160> <<< 
12613 1727096137.46686: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12613 1727096137.46693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c92bb90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c92bec0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c929e50> <<< 12613 1727096137.46732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12613 1727096137.46735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12613 1727096137.47042: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 12613 1727096137.47045: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c933a40> import '_tokenize' # <<< 12613 1727096137.47066: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c932510> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c932270> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12613 1727096137.47120: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c932780> <<< 12613 1727096137.47147: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c92a360> <<< 12613 1727096137.47183: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c977c50> <<< 12613 1727096137.47275: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fc97c977cb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 12613 1727096137.47302: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096137.47317: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c979820> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c9795e0> <<< 12613 1727096137.47323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12613 1727096137.47597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c97bd70> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c979ee0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c97f4a0> <<< 12613 1727096137.47759: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c97be30> <<< 12613 1727096137.47763: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c980230> <<< 12613 1727096137.47836: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c980260> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c9805c0> import 
'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c977fe0> <<< 12613 1727096137.47844: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 12613 1727096137.48048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12613 1727096137.48051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c983ce0> <<< 12613 1727096137.48151: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c80cfb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c982480> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c983800> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c982090> <<< 12613 1727096137.48186: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.48198: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 12613 1727096137.48201: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.48366: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.48581: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 12613 1727096137.48631: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.48673: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.49218: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.49769: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 12613 1727096137.49776: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 12613 1727096137.49809: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 12613 1727096137.49823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096137.50084: stdout chunk (state=3): >>># extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c811190> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c812030> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c80d100> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 12613 1727096137.50089: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.50270: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.50388: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 12613 1727096137.50392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c811f10> <<< 12613 1727096137.50413: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.51173: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.51449: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.51460: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12613 1727096137.51466: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.51505: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.51534: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12613 1727096137.51775: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.51815: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12613 1727096137.51831: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.52294: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12613 1727096137.52344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12613 1727096137.52352: stdout chunk (state=3): >>>import '_ast' # <<< 12613 1727096137.52426: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c813020> # zipimport: zlib available <<< 12613 1727096137.52510: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.52584: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 12613 1727096137.52588: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 12613 1727096137.52597: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 12613 1727096137.52613: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12613 1727096137.52881: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.52918: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12613 1727096137.53039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096137.53054: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c81daf0> <<< 12613 1727096137.53383: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c81ac60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12613 1727096137.53441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12613 1727096137.53445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12613 1727096137.53473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12613 1727096137.53709: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c906510> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c9fe1e0> <<< 12613 1727096137.53733: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c81db80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c9806e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12613 1727096137.53783: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12613 1727096137.53796: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.53806: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 12613 1727096137.53809: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.53932: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 12613 1727096137.53948: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.53960: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.53976: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.54019: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.54153: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 12613 1727096137.54471: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 12613 1727096137.54575: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.54796: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096137.54818: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 12613 1727096137.54824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 12613 1727096137.54919: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b18b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 12613 1727096137.55021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 12613 1727096137.55030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 12613 1727096137.55033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 12613 1727096137.55035: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 12613 1727096137.55037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 12613 1727096137.55039: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c46faa0> <<< 12613 1727096137.55166: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c46fe00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c89a450> <<< 12613 1727096137.55172: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc97c8b2420> <<< 12613 1727096137.55174: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b3f20> <<< 12613 1727096137.55177: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b39e0> <<< 12613 1727096137.55463: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c486f30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c4867e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096137.55466: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c4869c0> <<< 12613 1727096137.55470: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c485c40> <<< 12613 1727096137.55472: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12613 1727096137.55547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 12613 1727096137.55555: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c487110> <<< 12613 1727096137.55873: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c4ddbb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c487bc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b0260> import 'ansible.module_utils.facts.timeout' # <<< 12613 1727096137.55899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 12613 1727096137.55927: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.55996: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 12613 1727096137.56000: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.56256: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 12613 1727096137.56260: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.56281: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 12613 1727096137.56287: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.56415: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.56422: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.56476: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.56543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 12613 1727096137.56549: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.57458: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.57547: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.57691: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # <<< 12613 1727096137.57825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.57852: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 12613 1727096137.57925: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.58025: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 12613 1727096137.58252: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.58379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c4dfd70> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 12613 1727096137.58382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12613 1727096137.58555: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c4de660> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 12613 1727096137.58682: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.59055: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.59059: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12613 1727096137.59116: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096137.59172: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c51ddc0> <<< 12613 1727096137.59445: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c501ac0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.59486: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 12613 1727096137.59492: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.59690: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.59780: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.59930: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 12613 1727096137.59936: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.59984: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.60151: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12613 1727096137.60163: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096137.60211: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096137.60221: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c5258e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c51dbb0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 12613 1727096137.60258: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 12613 1727096137.60473: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 12613 1727096137.60497: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.60756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # 
# zipimport: zlib available <<< 12613 1727096137.60759: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.61179: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.61340: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 12613 1727096137.61470: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.61940: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.62197: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.62716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 12613 1727096137.62724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 12613 1727096137.62882: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.63084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.63240: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 12613 1727096137.63385: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.63472: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 12613 1727096137.63492: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.63508: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 12613 1727096137.63515: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.63572: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.63608: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 12613 1727096137.63616: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.63720: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.63843: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64145: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.64309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 12613 1727096137.64388: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64391: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64394: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 12613 1727096137.64555: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64559: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 12613 1727096137.64561: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64569: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 12613 1727096137.64748: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64754: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 12613 1727096137.64813: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.64947: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 12613 1727096137.65120: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.65502: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.65698: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 12613 1727096137.65719: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.65820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 12613 1727096137.65870: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.66101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096137.66116: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.66155: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.66282: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.66510: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 12613 1727096137.66666: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.66888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 12613 1727096137.66928: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.66986: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 12613 1727096137.67023: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.67212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 12613 1727096137.67261: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # <<< 12613 1727096137.67265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 12613 1727096137.67269: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.67393: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.67466: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 12613 1727096137.67602: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096137.68132: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 12613 1727096137.68153: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 12613 1727096137.68166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 12613 1727096137.68220: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c2ba120> <<< 12613 1727096137.68236: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c2b88f0> <<< 12613 1727096137.68335: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c2b3bf0> <<< 12613 1727096137.80876: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c301910> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 12613 1727096137.80884: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c300d40> <<< 12613 1727096137.80934: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096137.80980: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c302f90> <<< 12613 1727096137.80994: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c301bb0> <<< 12613 1727096137.81263: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame 
PyThreadState_Clear: warning: thread still has a frame <<< 12613 1727096138.06172: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.67041015625, "5m": 0.32763671875, "15m": 0.13818359375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 290, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": 
[{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797634048, "block_size": 4096, "block_total": 65519099, "block_available": 63915438, "block_used": 1603661, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "38", "epoch": "1727096138", "epoch_int": "1727096138", "date": "2024-09-23", "time": "08:55:38", "iso8601_micro": "2024-09-23T12:55:38.011324Z", "iso8601": "2024-09-23T12:55:38Z", "iso8601_basic": "20240923T085538011324", "iso8601_basic_short": "20240923T085538", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": 
"off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12613 1727096138.07291: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing 
pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanu<<< 12613 1727096138.07443: stdout chunk (state=3): >>>p[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing<<< 12613 1727096138.07599: stdout chunk (state=3): >>> ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 12613 1727096138.07911: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy 
importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib<<< 12613 1727096138.07947: stdout chunk (state=3): >>> # destroy bz2 # destroy lzma <<< 12613 1727096138.07950: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 12613 1727096138.07985: stdout chunk (state=3): >>># destroy ntpath <<< 12613 1727096138.08138: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 12613 1727096138.08250: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 12613 1727096138.08253: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 12613 1727096138.08310: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 12613 1727096138.08465: stdout chunk (state=3): >>># destroy termios <<< 12613 1727096138.08521: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12613 1727096138.08550: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping 
warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 12613 1727096138.08686: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12613 1727096138.08904: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12613 1727096138.08924: stdout chunk (state=3): >>># destroy _collections <<< 12613 1727096138.09019: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 12613 1727096138.09039: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 12613 1727096138.09069: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12613 1727096138.09337: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 12613 1727096138.09849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
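The chunks above complete the run of the ansible.legacy.setup module on the remote host: the verbose Python import trace, the returned "ansible_facts" JSON payload, the interpreter cleanup messages, and the SSH control master reporting exit status 0 before closing the shared connection to 10.31.11.125. Each key in that payload, such as ansible_distribution, ansible_distribution_version and ansible_default_ipv4, becomes a host fact available to later tasks. The short play below is only an illustrative sketch (the play, its hosts pattern, and the task names are hypothetical and not part of this run) of how facts like the ones returned above are typically consumed:

- name: Consume facts gathered by the setup module (illustrative sketch)
  hosts: all
  gather_facts: true
  tasks:
    - name: Print distribution and primary IPv4 address
      ansible.builtin.debug:
        msg: >-
          {{ ansible_facts['distribution'] }} {{ ansible_facts['distribution_version'] }}
          reachable at {{ ansible_facts['default_ipv4']['address'] }}

With the values visible in the payload above, such a task would print "CentOS 10 reachable at 10.31.11.125".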
<<< 12613 1727096138.09884: stderr chunk (state=3): >>><<< 12613 1727096138.09890: stdout chunk (state=3): >>><<< 12613 1727096138.10016: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d4184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d3e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d41aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d22d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d22dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d26be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d26bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2a37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2a3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d283ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2811f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d268fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2c3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2c2390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d282090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2c0bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2f8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d268230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d2f8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2f8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d2f8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d266d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2f9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2f9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2fa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d3106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d311d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc97d312c30> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d313290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d312180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d313d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d313440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2fa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d047bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d0706e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d070440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d070620> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d070fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97d071970> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc97d070890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d045d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d072cf0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d070e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2fabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d09f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d0c33e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d1201a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d1228d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d1202c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d0ed190> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf251f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d0c21e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d073bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fc97d0c28a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_25e1cyeb/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf86f00> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf65df0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf64f50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cf84c50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97cfbe7e0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cfbe570> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cfbde80> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cfbe5d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97d2829c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97cfbf4d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97cfbf710> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97cfbfc50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c9259d0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c9275f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c927f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c929160> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c92bb90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c92bec0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c929e50> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c933a40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c932510> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc97c932270> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c932780> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c92a360> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c977c50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c977cb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c979820> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c9795e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c97bd70> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c979ee0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c97f4a0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c97be30> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c980230> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c980260> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c9805c0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c977fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c983ce0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c80cfb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c982480> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c983800> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c982090> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c811190> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c812030> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c80d100> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c811f10> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c813020> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c81daf0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c81ac60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c906510> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c9fe1e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c81db80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c9806e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b18b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c46faa0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c46fe00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c89a450> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b2420> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b3f20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b39e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c486f30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c4867e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c4869c0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c485c40> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c487110> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c4ddbb0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c487bc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c8b0260> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c4dfd70> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c4de660> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c51ddc0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c501ac0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c5258e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c51dbb0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc97c2ba120> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c2b88f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c2b3bf0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c301910> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c300d40> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c302f90> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc97c301bb0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", 
"MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.67041015625, "5m": 0.32763671875, "15m": 0.13818359375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": 
"2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 290, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797634048, "block_size": 4096, "block_total": 65519099, "block_available": 63915438, "block_used": 1603661, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "38", "epoch": "1727096138", "epoch_int": "1727096138", "date": "2024-09-23", "time": "08:55:38", "iso8601_micro": "2024-09-23T12:55:38.011324Z", "iso8601": "2024-09-23T12:55:38Z", "iso8601_basic": "20240923T085538011324", "iso8601_basic_short": "20240923T085538", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] 
removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # 
cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob 
# cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] 
removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping 
_functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing 
ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing 
ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy 
importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
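The interpreter-discovery warning above can be avoided by pinning the interpreter for the host instead of relying on runtime discovery. A minimal YAML inventory sketch, using the host name, address, and interpreter path reported in this run; the layout and values are illustrative and are not taken from the actual inventory file used by this test:

    # Illustrative inventory sketch; the real inventory for this run may differ.
    # Pinning ansible_python_interpreter keeps a later Python installation on the
    # managed host from changing which interpreter Ansible modules run under.
    all:
      hosts:
        managed_node1:
          ansible_host: 10.31.11.125                       # address used by the SSH connection above
          ansible_python_interpreter: /usr/bin/python3.12  # interpreter reported by the warning

The same effect can be achieved globally with the interpreter_python setting in ansible.cfg, as described on the interpreter_discovery page linked in the warning.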
12613 1727096138.11885: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12613 1727096138.11888: _low_level_execute_command(): starting 12613 1727096138.11890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096136.4284427-12640-68497737261898/ > /dev/null 2>&1 && sleep 0' 12613 1727096138.11892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096138.11894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096138.11896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096138.11898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12613 1727096138.11900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 12613 1727096138.11902: stderr chunk (state=3): >>>debug2: match not found <<< 12613 1727096138.11904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096138.11906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12613 1727096138.11908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 12613 1727096138.12030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12613 1727096138.12037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096138.12040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096138.12042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12613 1727096138.12044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 12613 1727096138.12046: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096138.12153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096138.12406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096138.14319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096138.14575: stderr chunk (state=3): >>><<< 12613 1727096138.14579: stdout chunk (state=3): >>><<< 12613 1727096138.14582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096138.14584: handler run complete 12613 1727096138.14586: variable 'ansible_facts' from source: unknown 12613 1727096138.14630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096138.15914: variable 'ansible_facts' from source: unknown 12613 1727096138.16000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096138.16157: attempt loop complete, returning result 12613 1727096138.16161: _execute() done 12613 1727096138.16163: dumping result to json 12613 1727096138.16166: done dumping result, returning 12613 1727096138.16175: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-a9dd-d073-0000000001bc] 12613 1727096138.16180: sending task result for task 0afff68d-5257-a9dd-d073-0000000001bc 12613 1727096138.16685: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001bc 12613 1727096138.16689: WORKER PROCESS EXITING ok: [managed_node1] 12613 1727096138.17283: no more pending results, returning what we have 12613 1727096138.17287: results queue empty 12613 1727096138.17288: checking for any_errors_fatal 12613 1727096138.17289: done checking for any_errors_fatal 12613 1727096138.17290: checking for max_fail_percentage 12613 1727096138.17292: done checking for max_fail_percentage 12613 1727096138.17292: checking to see if all hosts have failed and the running result is not ok 12613 1727096138.17293: done checking to see if all hosts have failed 12613 1727096138.17294: getting the remaining hosts for this loop 12613 1727096138.17296: done getting the remaining hosts for this loop 12613 1727096138.17300: getting the next task for host managed_node1 12613 1727096138.17307: done getting next task for host managed_node1 12613 1727096138.17308: ^ task is: TASK: meta (flush_handlers) 12613 1727096138.17310: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096138.17315: getting variables 12613 1727096138.17316: in VariableManager get_vars() 12613 1727096138.17341: Calling all_inventory to load vars for managed_node1 12613 1727096138.17472: Calling groups_inventory to load vars for managed_node1 12613 1727096138.17477: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096138.17487: Calling all_plugins_play to load vars for managed_node1 12613 1727096138.17490: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096138.17502: Calling groups_plugins_play to load vars for managed_node1 12613 1727096138.17911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096138.18296: done with get_vars() 12613 1727096138.18310: done getting variables 12613 1727096138.18484: in VariableManager get_vars() 12613 1727096138.18495: Calling all_inventory to load vars for managed_node1 12613 1727096138.18497: Calling groups_inventory to load vars for managed_node1 12613 1727096138.18500: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096138.18505: Calling all_plugins_play to load vars for managed_node1 12613 1727096138.18507: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096138.18509: Calling groups_plugins_play to load vars for managed_node1 12613 1727096138.18712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096138.19073: done with get_vars() 12613 1727096138.19091: done queuing things up, now waiting for results queue to drain 12613 1727096138.19093: results queue empty 12613 1727096138.19094: checking for any_errors_fatal 12613 1727096138.19096: done checking for any_errors_fatal 12613 1727096138.19097: checking for max_fail_percentage 12613 1727096138.19098: done checking for max_fail_percentage 12613 1727096138.19098: checking to see if all hosts have failed and the running result is not ok 12613 1727096138.19099: done checking to see if all hosts have failed 12613 1727096138.19100: getting the remaining hosts for this loop 12613 1727096138.19100: done getting the remaining hosts for this loop 12613 1727096138.19103: getting the next task for host managed_node1 12613 1727096138.19107: done getting next task for host managed_node1 12613 1727096138.19116: ^ task is: TASK: Include the task 'el_repo_setup.yml' 12613 1727096138.19117: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096138.19119: getting variables 12613 1727096138.19120: in VariableManager get_vars() 12613 1727096138.19128: Calling all_inventory to load vars for managed_node1 12613 1727096138.19130: Calling groups_inventory to load vars for managed_node1 12613 1727096138.19131: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096138.19137: Calling all_plugins_play to load vars for managed_node1 12613 1727096138.19138: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096138.19141: Calling groups_plugins_play to load vars for managed_node1 12613 1727096138.19281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096138.19478: done with get_vars() 12613 1727096138.19488: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:10 Monday 23 September 2024 08:55:38 -0400 (0:00:01.820) 0:00:01.832 ****** 12613 1727096138.19581: entering _queue_task() for managed_node1/include_tasks 12613 1727096138.19583: Creating lock for include_tasks 12613 1727096138.19952: worker is 1 (out of 1 available) 12613 1727096138.19971: exiting _queue_task() for managed_node1/include_tasks 12613 1727096138.19990: done queuing things up, now waiting for results queue to drain 12613 1727096138.19991: waiting for pending results... 12613 1727096138.20657: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 12613 1727096138.20692: in run() - task 0afff68d-5257-a9dd-d073-000000000006 12613 1727096138.20711: variable 'ansible_search_path' from source: unknown 12613 1727096138.20970: calling self._execute() 12613 1727096138.20995: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096138.21006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096138.21093: variable 'omit' from source: magic vars 12613 1727096138.21329: _execute() done 12613 1727096138.21339: dumping result to json 12613 1727096138.21419: done dumping result, returning 12613 1727096138.21442: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-a9dd-d073-000000000006] 12613 1727096138.21522: sending task result for task 0afff68d-5257-a9dd-d073-000000000006 12613 1727096138.21609: done sending task result for task 0afff68d-5257-a9dd-d073-000000000006 12613 1727096138.21613: WORKER PROCESS EXITING 12613 1727096138.21666: no more pending results, returning what we have 12613 1727096138.21674: in VariableManager get_vars() 12613 1727096138.21710: Calling all_inventory to load vars for managed_node1 12613 1727096138.21714: Calling groups_inventory to load vars for managed_node1 12613 1727096138.21718: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096138.21732: Calling all_plugins_play to load vars for managed_node1 12613 1727096138.21735: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096138.21738: Calling groups_plugins_play to load vars for managed_node1 12613 1727096138.22215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096138.22808: done with get_vars() 12613 1727096138.22818: variable 'ansible_search_path' from source: unknown 12613 1727096138.22835: we have included files to process 12613 
1727096138.22837: generating all_blocks data 12613 1727096138.22838: done generating all_blocks data 12613 1727096138.22839: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12613 1727096138.22840: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12613 1727096138.22842: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12613 1727096138.23740: in VariableManager get_vars() 12613 1727096138.23761: done with get_vars() 12613 1727096138.23775: done processing included file 12613 1727096138.23778: iterating over new_blocks loaded from include file 12613 1727096138.23780: in VariableManager get_vars() 12613 1727096138.23790: done with get_vars() 12613 1727096138.23792: filtering new block on tags 12613 1727096138.23807: done filtering new block on tags 12613 1727096138.23811: in VariableManager get_vars() 12613 1727096138.23820: done with get_vars() 12613 1727096138.23822: filtering new block on tags 12613 1727096138.23840: done filtering new block on tags 12613 1727096138.23842: in VariableManager get_vars() 12613 1727096138.23856: done with get_vars() 12613 1727096138.23858: filtering new block on tags 12613 1727096138.23873: done filtering new block on tags 12613 1727096138.23876: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 12613 1727096138.23882: extending task lists for all hosts with included blocks 12613 1727096138.23929: done extending task lists 12613 1727096138.23930: done processing included files 12613 1727096138.23931: results queue empty 12613 1727096138.23932: checking for any_errors_fatal 12613 1727096138.23933: done checking for any_errors_fatal 12613 1727096138.23934: checking for max_fail_percentage 12613 1727096138.23935: done checking for max_fail_percentage 12613 1727096138.23935: checking to see if all hosts have failed and the running result is not ok 12613 1727096138.23936: done checking to see if all hosts have failed 12613 1727096138.23937: getting the remaining hosts for this loop 12613 1727096138.23938: done getting the remaining hosts for this loop 12613 1727096138.23940: getting the next task for host managed_node1 12613 1727096138.23944: done getting next task for host managed_node1 12613 1727096138.23946: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 12613 1727096138.23948: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096138.23953: getting variables 12613 1727096138.23955: in VariableManager get_vars() 12613 1727096138.23963: Calling all_inventory to load vars for managed_node1 12613 1727096138.23966: Calling groups_inventory to load vars for managed_node1 12613 1727096138.23969: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096138.23975: Calling all_plugins_play to load vars for managed_node1 12613 1727096138.23977: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096138.23981: Calling groups_plugins_play to load vars for managed_node1 12613 1727096138.24147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096138.24330: done with get_vars() 12613 1727096138.24340: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 08:55:38 -0400 (0:00:00.048) 0:00:01.881 ****** 12613 1727096138.24415: entering _queue_task() for managed_node1/setup 12613 1727096138.24747: worker is 1 (out of 1 available) 12613 1727096138.24762: exiting _queue_task() for managed_node1/setup 12613 1727096138.24775: done queuing things up, now waiting for results queue to drain 12613 1727096138.24776: waiting for pending results... 12613 1727096138.25199: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 12613 1727096138.25208: in run() - task 0afff68d-5257-a9dd-d073-0000000001cd 12613 1727096138.25212: variable 'ansible_search_path' from source: unknown 12613 1727096138.25214: variable 'ansible_search_path' from source: unknown 12613 1727096138.25217: calling self._execute() 12613 1727096138.25291: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096138.25303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096138.25317: variable 'omit' from source: magic vars 12613 1727096138.26181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096138.28693: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096138.28744: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096138.28785: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096138.28813: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096138.28834: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096138.28907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096138.28990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096138.28994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 12613 1727096138.28996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096138.28998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096138.29163: variable 'ansible_facts' from source: unknown 12613 1727096138.29192: variable 'network_test_required_facts' from source: task vars 12613 1727096138.29227: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 12613 1727096138.29231: variable 'omit' from source: magic vars 12613 1727096138.29260: variable 'omit' from source: magic vars 12613 1727096138.29287: variable 'omit' from source: magic vars 12613 1727096138.29312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12613 1727096138.29335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12613 1727096138.29355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12613 1727096138.29366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096138.29376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096138.29399: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12613 1727096138.29402: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096138.29404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096138.29477: Set connection var ansible_connection to ssh 12613 1727096138.29484: Set connection var ansible_module_compression to ZIP_DEFLATED 12613 1727096138.29491: Set connection var ansible_timeout to 10 12613 1727096138.29497: Set connection var ansible_shell_type to sh 12613 1727096138.29502: Set connection var ansible_pipelining to False 12613 1727096138.29507: Set connection var ansible_shell_executable to /bin/sh 12613 1727096138.29527: variable 'ansible_shell_executable' from source: unknown 12613 1727096138.29529: variable 'ansible_connection' from source: unknown 12613 1727096138.29535: variable 'ansible_module_compression' from source: unknown 12613 1727096138.29538: variable 'ansible_shell_type' from source: unknown 12613 1727096138.29541: variable 'ansible_shell_executable' from source: unknown 12613 1727096138.29543: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096138.29545: variable 'ansible_pipelining' from source: unknown 12613 1727096138.29548: variable 'ansible_timeout' from source: unknown 12613 1727096138.29549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096138.29653: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12613 1727096138.29659: variable 'omit' from source: magic vars 12613 1727096138.29673: starting attempt loop 12613 
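[Editor's note] The conditional evaluated above, not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts, shows that the fact-gathering task at el_repo_setup.yml:3 only runs when the required fact keys are not already present, and the queued action is the setup module. A hypothetical sketch of that task follows; the task name, the setup action and the when expression are copied from the trace, while the module arguments (for example gather_subset) are assumptions, since they are not visible in this part of the log.

```yaml
# Sketch of the task at el_repo_setup.yml:3, reconstructed from the trace.
- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset: min   # assumed; module arguments are not shown in this excerpt
  when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
```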
1727096138.29676: running the handler 12613 1727096138.29684: _low_level_execute_command(): starting 12613 1727096138.29692: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12613 1727096138.30256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096138.30265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096138.30363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096138.30415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096138.32506: stdout chunk (state=3): >>>/root <<< 12613 1727096138.32648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096138.32692: stderr chunk (state=3): >>><<< 12613 1727096138.32695: stdout chunk (state=3): >>><<< 12613 1727096138.32711: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096138.32755: _low_level_execute_command(): starting 12613 1727096138.32759: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665 `" && echo ansible-tmp-1727096138.3272693-12717-197936071283665="` echo /root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665 `" ) && sleep 0' 12613 1727096138.33239: stderr chunk 
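[Editor's note] The first _low_level_execute_command() round trip above succeeds over the multiplexed SSH connection using the connection variables the executor reported: ansible_connection=ssh, ansible_timeout=10, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False. In this run only ansible_host and ansible_ssh_extra_args actually come from host vars for managed_node1; the rest are defaults. Purely as an illustration of how such settings could be pinned explicitly, a host_vars file might look like the sketch below; the address is taken from the ssh debug output, everything else mirrors the logged values and is not copied from the real inventory.

```yaml
# Hypothetical host_vars/managed_node1.yml; illustrative only, not the inventory
# actually used in this run.
ansible_host: 10.31.11.125            # address seen in the ssh debug output above
# ansible_ssh_extra_args: ...         # set in host vars in this run; value not shown in this excerpt
ansible_connection: ssh
ansible_timeout: 10
ansible_pipelining: false
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
```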
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096138.33243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096138.33245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 12613 1727096138.33248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 12613 1727096138.33250: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096138.33310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 12613 1727096138.33313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096138.33318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096138.33397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096138.36315: stdout chunk (state=3): >>>ansible-tmp-1727096138.3272693-12717-197936071283665=/root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665 <<< 12613 1727096138.36799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096138.36803: stdout chunk (state=3): >>><<< 12613 1727096138.36805: stderr chunk (state=3): >>><<< 12613 1727096138.36807: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096138.3272693-12717-197936071283665=/root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096138.36809: variable 'ansible_module_compression' from source: unknown 12613 1727096138.36811: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-12613tatu8w7b/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12613 1727096138.36871: variable 'ansible_facts' from source: unknown 12613 1727096138.37111: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/AnsiballZ_setup.py 12613 1727096138.37293: Sending initial data 12613 1727096138.37357: Sent initial data (154 bytes) 12613 1727096138.38353: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096138.38442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12613 1727096138.38584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096138.38777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096138.38821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096138.38890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096138.41233: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12613 1727096138.41261: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12613 1727096138.41364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12613 1727096138.41469: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12613tatu8w7b/tmphc4j11y_ /root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/AnsiballZ_setup.py <<< 12613 1727096138.41479: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/AnsiballZ_setup.py" <<< 12613 1727096138.41531: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-12613tatu8w7b/tmphc4j11y_" to remote "/root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/AnsiballZ_setup.py" <<< 12613 1727096138.44674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096138.44679: stdout chunk (state=3): >>><<< 12613 1727096138.44682: stderr chunk (state=3): >>><<< 12613 1727096138.44684: done transferring module to remote 12613 1727096138.44686: _low_level_execute_command(): starting 12613 1727096138.44688: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/ /root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/AnsiballZ_setup.py && sleep 0' 12613 1727096138.45930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096138.46157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096138.46161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 12613 1727096138.46187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096138.46233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096138.46377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12613 1727096138.49247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096138.49251: stdout chunk (state=3): >>><<< 12613 1727096138.49254: stderr chunk (state=3): >>><<< 12613 1727096138.49256: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 12613 1727096138.49258: _low_level_execute_command(): starting 12613 1727096138.49261: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/AnsiballZ_setup.py && sleep 0' 12613 1727096138.50599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096138.50614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096138.50627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096138.50665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096138.50847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096138.50998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096138.51109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12613 1727096138.54435: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 12613 1727096138.54470: stdout chunk (state=3): >>> import '_warnings' # import '_weakref' # <<< 12613 1727096138.54605: stdout chunk (state=3): >>>import '_io' # <<< 12613 1727096138.54609: stdout chunk (state=3): >>>import 'marshal' # <<< 12613 1727096138.54720: stdout chunk (state=3): >>>import 'posix' # <<< 12613 1727096138.54745: stdout chunk (state=3): >>> import '_frozen_importlib_external' # # installing zipimport hook <<< 12613 1727096138.54765: stdout chunk (state=3): >>>import 'time' # <<< 12613 1727096138.54831: stdout chunk (state=3): >>> import 'zipimport' # <<< 12613 1727096138.54876: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 12613 1727096138.54945: stdout 
chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096138.54960: stdout chunk (state=3): >>>import '_codecs' # <<< 12613 1727096138.55036: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 12613 1727096138.55163: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 12613 1727096138.55166: stdout chunk (state=3): >>> <<< 12613 1727096138.55194: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cf104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cedfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cf12a50> import '_signal' # import '_abc' # import 'abc' # <<< 12613 1727096138.55224: stdout chunk (state=3): >>>import 'io' # <<< 12613 1727096138.55272: stdout chunk (state=3): >>> import '_stat' # <<< 12613 1727096138.55373: stdout chunk (state=3): >>> import 'stat' # <<< 12613 1727096138.55501: stdout chunk (state=3): >>> import '_collections_abc' # import 'genericpath' # <<< 12613 1727096138.55520: stdout chunk (state=3): >>>import 'posixpath' # <<< 12613 1727096138.55602: stdout chunk (state=3): >>>import 'os' # <<< 12613 1727096138.55621: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 12613 1727096138.55682: stdout chunk (state=3): >>>Processing user site-packages <<< 12613 1727096138.55712: stdout chunk (state=3): >>>Processing global site-packages<<< 12613 1727096138.55724: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 12613 1727096138.55765: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 12613 1727096138.55821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cce5130> <<< 12613 1727096138.56109: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 12613 1727096138.56112: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 12613 1727096138.56115: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cce5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 12613 1727096138.56717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12613 1727096138.56733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 12613 1727096138.57004: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12613 1727096138.57025: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd23ec0><<< 12613 1727096138.57048: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 12613 1727096138.57076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 12613 1727096138.57124: stdout chunk (state=3): >>> import '_operator' # <<< 12613 1727096138.57145: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd23f80> <<< 12613 1727096138.57179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12613 1727096138.57261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12613 1727096138.57344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc'<<< 12613 1727096138.57380: stdout chunk (state=3): >>> import 'itertools' # <<< 12613 1727096138.57437: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 12613 1727096138.57465: stdout chunk (state=3): >>> import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd5b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 12613 1727096138.57538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 12613 1727096138.57549: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd5bec0> import '_collections' # <<< 12613 1727096138.57761: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd3bb60> import '_functools' # <<< 12613 1727096138.57765: stdout chunk (state=3): >>> import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd392b0> <<< 12613 1727096138.57864: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd21070> <<< 12613 1727096138.57985: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc 
matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 12613 1727096138.58006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 12613 1727096138.58026: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 12613 1727096138.58089: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd7b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd7a3f0> <<< 12613 1727096138.58109: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 12613 1727096138.58191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 12613 1727096138.58217: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd3a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd78bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12613 1727096138.58324: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdb0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdb0fe0> <<< 12613 1727096138.58334: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd1ee10> <<< 12613 1727096138.58417: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096138.58430: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb1670> import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb1370> import 'importlib.machinery' # <<< 12613 1727096138.58507: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb2540> import 'importlib.util' # import 'runpy' # <<< 12613 1727096138.58600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 12613 1727096138.58629: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdc8740> <<< 12613 1727096138.58657: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdc9e20> <<< 12613 1727096138.58763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 12613 1727096138.58768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdcacc0> <<< 12613 1727096138.58784: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdcb2f0> <<< 12613 1727096138.58807: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12613 1727096138.58863: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdcbd70> <<< 12613 1727096138.58922: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdcb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb24b0> <<< 12613 1727096138.58959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12613 1727096138.58994: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12613 1727096138.59053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12613 1727096138.59059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12613 1727096138.59160: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cabfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12613 1727096138.59164: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cae87a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cae8500> <<< 12613 1727096138.59440: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cae87d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cae9100> <<< 12613 1727096138.59623: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cae9af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cae89b0> <<< 12613 1727096138.59646: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cabddf0> <<< 12613 1727096138.59669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12613 1727096138.59901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8caeaf00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cae9c40> 
import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12613 1727096138.59951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12613 1727096138.59983: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb13230> <<< 12613 1727096138.60048: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12613 1727096138.60094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096138.60108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12613 1727096138.60120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12613 1727096138.60170: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb375f0> <<< 12613 1727096138.60250: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12613 1727096138.60322: stdout chunk (state=3): >>>import 'ntpath' # <<< 12613 1727096138.60364: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb98380> <<< 12613 1727096138.60380: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12613 1727096138.60415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12613 1727096138.60436: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12613 1727096138.60613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb9aae0> <<< 12613 1727096138.60729: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb984a0> <<< 12613 1727096138.60775: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb59370> <<< 12613 1727096138.60815: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c9a5430> <<< 12613 1727096138.60829: stdout chunk (state=3): >>>import 
'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb363f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8caebe00> <<< 12613 1727096138.61181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1d8cb36750> <<< 12613 1727096138.61619: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_q_v8meid/ansible_setup_payload.zip' # zipimport: zlib available <<< 12613 1727096138.61713: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.61746: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12613 1727096138.61765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12613 1727096138.61814: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12613 1727096138.61916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12613 1727096138.61951: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca0f170> import '_typing' # <<< 12613 1727096138.62275: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c9ee060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c9ed1c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 12613 1727096138.62309: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 12613 1727096138.62489: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.64562: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.66428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca0d010> <<< 12613 1727096138.66470: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 12613 1727096138.66493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12613 1727096138.66522: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12613 1727096138.66532: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8ca3ea20> <<< 12613 1727096138.66580: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca3e7b0> <<< 12613 1727096138.66631: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca3e0f0> <<< 12613 1727096138.66652: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12613 1727096138.66717: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca3e540> <<< 12613 1727096138.66731: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca0fb90> import 'atexit' # <<< 12613 1727096138.66771: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8ca3f7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8ca3fa10> <<< 12613 1727096138.66855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12613 1727096138.66876: stdout chunk (state=3): >>>import '_locale' # <<< 12613 1727096138.66939: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca3ff50> <<< 12613 1727096138.66964: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12613 1727096138.66997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12613 1727096138.67062: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c331d00> <<< 12613 1727096138.67084: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096138.67115: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c333920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 12613 1727096138.67118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12613 1727096138.67184: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c3342f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12613 
1727096138.67217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12613 1727096138.67255: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c335490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12613 1727096138.67307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12613 1727096138.67406: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c337f80> <<< 12613 1727096138.67453: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c33c2f0> <<< 12613 1727096138.67474: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c336240> <<< 12613 1727096138.67502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12613 1727096138.67587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12613 1727096138.67605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12613 1727096138.67805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 12613 1727096138.67817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c33fef0> <<< 12613 1727096138.68123: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c33e9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c33e720> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c33ec90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c336750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c383f50> <<< 12613 1727096138.68127: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c384230> <<< 12613 1727096138.68170: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 12613 1727096138.68234: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096138.68315: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c385cd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c385a90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12613 1727096138.68491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c388260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c386390> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12613 1727096138.68696: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c38b9b0> <<< 12613 1727096138.68744: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c3883b0> <<< 12613 1727096138.68787: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c38c7a0> <<< 12613 1727096138.68827: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c38c9b0> <<< 
12613 1727096138.68969: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c38ccb0> <<< 12613 1727096138.68997: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c384350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096138.69036: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c218380> <<< 12613 1727096138.69311: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c219430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c38eb10> <<< 12613 1727096138.69401: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c38fec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c38e750> # zipimport: zlib available <<< 12613 1727096138.69405: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.69425: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 12613 1727096138.69511: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.69857: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 12613 1727096138.69955: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.70054: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.70958: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.71854: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 12613 1727096138.71979: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096138.72001: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c21d760> <<< 12613 1727096138.72100: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12613 1727096138.72108: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c21e450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c219580> <<< 12613 1727096138.72179: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12613 1727096138.72285: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 12613 1727096138.72289: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.72533: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.72759: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c21e600> # zipimport: zlib available <<< 12613 1727096138.73607: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74189: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74288: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74397: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12613 1727096138.74401: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74441: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74497: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12613 1727096138.74501: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74586: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74725: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12613 1727096138.74732: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74770: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 12613 1727096138.74804: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.74877: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 12613 1727096138.75391: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.75572: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12613 1727096138.75671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12613 
1727096138.75693: stdout chunk (state=3): >>>import '_ast' # <<< 12613 1727096138.75789: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c21f890> # zipimport: zlib available <<< 12613 1727096138.75879: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.75989: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 12613 1727096138.76005: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 12613 1727096138.76027: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76084: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76128: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12613 1727096138.76131: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76197: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76243: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76371: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76441: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12613 1727096138.76583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096138.76593: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c22a270> <<< 12613 1727096138.76635: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c225ac0> <<< 12613 1727096138.76671: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 12613 1727096138.76682: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76766: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76852: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76894: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.76932: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 12613 1727096138.76955: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096138.76959: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 12613 1727096138.76994: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12613 1727096138.77088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from 
'/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12613 1727096138.77109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12613 1727096138.77127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12613 1727096138.77213: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c302ab0> <<< 12613 1727096138.77266: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c3fe780> <<< 12613 1727096138.77377: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c22a2d0> <<< 12613 1727096138.77398: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c21f110> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 12613 1727096138.77424: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.77533: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 12613 1727096138.77543: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.77560: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.77575: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 12613 1727096138.77582: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.77657: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.77747: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.77764: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.77903: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.77952: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.77994: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.78090: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 12613 1727096138.78095: stdout chunk (state=3): >>> <<< 12613 1727096138.78110: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.78234: stdout chunk (state=3): >>> # zipimport: zlib available<<< 12613 1727096138.78263: stdout chunk (state=3): >>> <<< 12613 1727096138.78396: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.78442: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.78447: stdout chunk (state=3): >>> <<< 12613 1727096138.78511: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 12613 1727096138.78535: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.78846: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.79094: stdout chunk (state=3): >>> <<< 12613 1727096138.79143: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.79215: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.79318: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 12613 1727096138.79321: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096138.79395: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 12613 1727096138.79410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 12613 1727096138.79442: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py<<< 12613 1727096138.79455: stdout chunk (state=3): >>> <<< 12613 1727096138.79513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 12613 1727096138.79536: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2ba4b0> <<< 12613 1727096138.79617: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 12613 1727096138.79648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 12613 1727096138.79664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 12613 1727096138.79763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc'<<< 12613 1727096138.79791: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py<<< 12613 1727096138.79806: stdout chunk (state=3): >>> <<< 12613 1727096138.79812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 12613 1727096138.79843: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8befc290><<< 12613 1727096138.79912: stdout chunk (state=3): >>> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'<<< 12613 1727096138.79944: stdout chunk (state=3): >>> # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096138.80044: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8befc5f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2a4230> <<< 12613 1727096138.80086: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2bafc0><<< 12613 1727096138.80092: stdout chunk (state=3): >>> <<< 12613 1727096138.80140: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2b8bf0> <<< 12613 1727096138.80173: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2b87d0> <<< 12613 1727096138.80212: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py<<< 12613 1727096138.80217: stdout chunk (state=3): >>> <<< 12613 1727096138.80320: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 12613 1727096138.80344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 12613 1727096138.80385: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py<<< 12613 1727096138.80395: stdout chunk (state=3): >>> <<< 12613 1727096138.80440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'<<< 12613 1727096138.80458: stdout chunk (state=3): >>> # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'<<< 12613 1727096138.80474: stdout chunk (state=3): >>> import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8beff560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8befee10><<< 12613 1727096138.80523: stdout chunk (state=3): >>> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'<<< 12613 1727096138.80529: stdout chunk (state=3): >>> <<< 12613 1727096138.80548: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'<<< 12613 1727096138.80555: stdout chunk (state=3): >>> import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8befef90><<< 12613 1727096138.80591: stdout chunk (state=3): >>> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8befe240><<< 12613 1727096138.80596: stdout chunk (state=3): >>> <<< 12613 1727096138.80625: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12613 1727096138.80847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8beff680><<< 12613 1727096138.80884: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py<<< 12613 1727096138.80887: stdout chunk (state=3): >>> <<< 12613 1727096138.80983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 12613 1727096138.80995: stdout chunk (state=3): >>> <<< 12613 1727096138.81007: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 12613 1727096138.81013: stdout chunk (state=3): >>> import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8bf4a180><<< 12613 1727096138.81064: stdout chunk (state=3): >>> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf481a0><<< 12613 1727096138.81070: stdout chunk (state=3): >>> <<< 12613 1727096138.81114: stdout chunk (state=3): 
>>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2b8800><<< 12613 1727096138.81117: stdout chunk (state=3): >>> <<< 12613 1727096138.81133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 12613 1727096138.81162: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.collector' # <<< 12613 1727096138.81192: stdout chunk (state=3): >>> # zipimport: zlib available<<< 12613 1727096138.81197: stdout chunk (state=3): >>> <<< 12613 1727096138.81220: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.81265: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 12613 1727096138.81370: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.81459: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.other.facter' # <<< 12613 1727096138.81489: stdout chunk (state=3): >>> <<< 12613 1727096138.81511: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.81611: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.81677: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 12613 1727096138.81708: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.81735: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12613 1727096138.81778: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 12613 1727096138.81871: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 12613 1727096138.81895: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.81983: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12613 1727096138.82074: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 12613 1727096138.82095: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.82154: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.82157: stdout chunk (state=3): >>> <<< 12613 1727096138.82211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 12613 1727096138.82249: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12613 1727096138.82550: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.82556: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.82559: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 12613 1727096138.83195: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.83499: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 12613 1727096138.83515: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.83896: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # <<< 12613 1727096138.83902: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.83975: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.dns' # <<< 12613 1727096138.84001: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.84037: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.84078: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 12613 1727096138.84088: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.84133: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.84166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 12613 1727096138.84228: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.84354: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.84429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 12613 1727096138.84434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 12613 1727096138.84474: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf4b470> <<< 12613 1727096138.84499: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 12613 1727096138.84547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12613 1727096138.84738: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf4acc0> <<< 12613 1727096138.84744: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 12613 1727096138.84783: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.84860: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.84952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 12613 1727096138.84988: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.85173: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.85293: stdout chunk (state=3): >>> <<< 12613 1727096138.85338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 12613 1727096138.85372: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.85484: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.85601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 12613 1727096138.85623: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.85691: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096138.85709: stdout chunk (state=3): >>> <<< 12613 1727096138.85773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 12613 1727096138.85872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12613 1727096138.86057: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096138.86065: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8bf8a300> <<< 12613 
1727096138.86337: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf7a150> <<< 12613 1727096138.86354: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 12613 1727096138.86406: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.86469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 12613 1727096138.86487: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.86576: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.86685: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.86816: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.86996: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 12613 1727096138.87016: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.87065: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 12613 1727096138.87082: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.87114: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.87163: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 12613 1727096138.87178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12613 1727096138.87223: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096138.87239: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8bf9dc70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf881d0> import 'ansible.module_utils.facts.system.user' # <<< 12613 1727096138.87283: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 12613 1727096138.87302: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.87336: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.87388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 12613 1727096138.87391: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.87546: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.87888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.88030: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.88089: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.88146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 12613 1727096138.88150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 12613 1727096138.88192: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.88226: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 
1727096138.88446: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.88667: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 12613 1727096138.88703: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.88883: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.89136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.89167: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.90095: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.90770: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 12613 1727096138.90793: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.90879: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.90997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 12613 1727096138.91000: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.91096: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.91200: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 12613 1727096138.91210: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.91373: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.91559: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 12613 1727096138.91562: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 12613 1727096138.91591: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.91676: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.91705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 12613 1727096138.91765: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.91864: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.92075: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.92511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.92577: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 12613 1727096138.92608: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.92635: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 12613 1727096138.92655: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.92725: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.92775: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 12613 1727096138.92794: stdout chunk (state=3): >>># zipimport: zlib available <<< 
12613 1727096138.92845: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.92897: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12613 1727096138.92922: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93180: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 12613 1727096138.93460: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93516: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 12613 1727096138.93586: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93613: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 12613 1727096138.93665: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93699: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93737: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 12613 1727096138.93750: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93773: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 12613 1727096138.93907: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.93992: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 12613 1727096138.94018: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 12613 1727096138.94041: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94082: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94121: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 12613 1727096138.94172: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94177: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94194: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94230: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94285: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94349: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94432: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 12613 1727096138.94454: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94500: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 12613 1727096138.94901: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.94952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 12613 1727096138.94977: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.95012: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12613 1727096138.95059: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 12613 1727096138.95079: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.95121: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.95159: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 12613 1727096138.95182: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.95254: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.95484: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 12613 1727096138.95540: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 12613 1727096138.95625: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096138.96629: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 12613 1727096138.96657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 12613 1727096138.96775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 12613 1727096138.96788: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8bda3c20> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bda06b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bda0170> <<< 12613 1727096138.97216: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "38", "epoch": "1727096138", "epoch_int": "1727096138", "date": "2024-09-23", "time": "08:55:38", "iso8601_micro": "2024-09-23T12:55:38.961073Z", "iso8601": "2024-09-23T12:55:38Z", "iso8601_basic": "20240923T085538961073", "iso8601_basic_short": "20240923T085538", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": 
"/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec2082d1236a0674562ec5e13633e7ec", "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}<<< 12613 1727096138.97240: stdout chunk (state=3): >>> <<< 12613 1727096138.97846: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 12613 1727096138.98013: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess <<< 12613 1727096138.98050: stdout chunk (state=3): >>># cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 12613 1727096138.98078: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing <<< 12613 1727096138.98409: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext <<< 12613 1727096138.98418: stdout chunk (state=3): >>># destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process <<< 12613 1727096138.98433: stdout chunk (state=3): >>># cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] 
removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat 
# cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 12613 1727096138.98617: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12613 1727096138.98657: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 12613 1727096138.98685: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 12613 1727096138.98711: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 12613 1727096138.98784: stdout chunk (state=3): >>># destroy ntpath <<< 12613 1727096138.98828: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 12613 1727096138.98841: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 12613 1727096138.98892: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 12613 1727096138.98944: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 12613 1727096138.99010: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 12613 1727096138.99054: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 12613 1727096138.99112: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 12613 1727096138.99138: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 12613 1727096138.99231: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 12613 1727096138.99311: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12613 
1727096138.99323: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix <<< 12613 1727096138.99448: stdout chunk (state=3): >>># destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 12613 1727096138.99465: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12613 1727096138.99583: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12613 1727096138.99638: stdout chunk (state=3): >>># destroy _collections <<< 12613 1727096138.99786: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 12613 1727096138.99790: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12613 1727096138.99861: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 12613 1727096138.99897: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 12613 
1727096138.99963: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 12613 1727096139.00088: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 12613 1727096139.00557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 12613 1727096139.00561: stdout chunk (state=3): >>><<< 12613 1727096139.00563: stderr chunk (state=3): >>><<< 12613 1727096139.00738: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cf104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cedfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cf12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cce5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cce5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd23ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd23f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd5b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd5bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd3bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd392b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd21070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd7b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd7a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd3a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd78bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdb0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdb0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cd1ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdc8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdc9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f1d8cdcacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdcb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cdcbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdcb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cabfc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cae87a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cae8500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cae87d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cae9100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8cae9af0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cae89b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cabddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8caeaf00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cae9c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cdb2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb13230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb375f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb98380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb9aae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb984a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb59370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c9a5430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8cb363f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8caebe00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f1d8cb36750> # zipimport: found 103 names in '/tmp/ansible_setup_payload_q_v8meid/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca0f170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c9ee060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c9ed1c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca0d010> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8ca3ea20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca3e7b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca3e0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca3e540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca0fb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8ca3f7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8ca3fa10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8ca3ff50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c331d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c333920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c3342f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c335490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c337f80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c33c2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c336240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c33fef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c33e9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c33e720> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c33ec90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c336750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c383f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c384230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c385cd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c385a90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c388260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c386390> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c38b9b0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c3883b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c38c7a0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c38c9b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c38ccb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c384350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c218380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c219430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c38eb10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c38fec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c38e750> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c21d760> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c21e450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c219580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c21e600> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c21f890> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8c22a270> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c225ac0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c302ab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c3fe780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c22a2d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c21f110> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2ba4b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8befc290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8befc5f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2a4230> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2bafc0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2b8bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2b87d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8beff560> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8befee10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8befef90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8befe240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8beff680> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8bf4a180> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf481a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8c2b8800> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf4b470> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf4acc0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8bf8a300> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf7a150> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8bf9dc70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bf881d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1d8bda3c20> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bda06b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1d8bda0170> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "38", "epoch": "1727096138", "epoch_int": "1727096138", "date": "2024-09-23", "time": "08:55:38", "iso8601_micro": "2024-09-23T12:55:38.961073Z", "iso8601": "2024-09-23T12:55:38Z", "iso8601_basic": "20240923T085538961073", "iso8601_basic_short": "20240923T085538", "tz": "EDT", 
"tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
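The JSON payload above is the return value of the setup module invoked with gather_subset "min" (the exact arguments are echoed further down in the "done with _execute_module (setup, {...})" entry), and the import/cleanup chatter wrapped around it is produced by PYTHONVERBOSE=1 in the remote environment (visible under ansible_env), which is also what triggers the "junk after the JSON data" warning that follows. As an illustrative sketch only, not the task file actually used by this test run, an equivalent minimal fact-gathering task could look like:

    - name: Gather the minimum subset of ansible_facts required by the network role test
      ansible.builtin.setup:
        gather_subset:
          - min          # matches gather_subset in the module invocation logged below
      # Hypothetical override: clearing PYTHONVERBOSE for this task should suppress the
      # import chatter that ends up interleaved with the module's JSON output.
      environment:
        PYTHONVERBOSE: ""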
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 12613 1727096139.02805: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12613 1727096139.02809: _low_level_execute_command(): starting 12613 1727096139.02812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096138.3272693-12717-197936071283665/ > /dev/null 2>&1 && sleep 0' 12613 1727096139.02814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096139.02988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096139.02992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096139.03128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096139.05139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096139.05143: stdout chunk (state=3): >>><<< 12613 1727096139.05174: stderr chunk (state=3): >>><<< 12613 1727096139.05178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096139.05180: handler run complete 12613 1727096139.05222: variable 'ansible_facts' from source: unknown 12613 1727096139.05276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.05619: variable 'ansible_facts' from source: unknown 12613 1727096139.05659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.05712: attempt loop complete, returning result 12613 1727096139.05716: _execute() done 12613 1727096139.05718: dumping result to json 12613 1727096139.05972: done dumping result, returning 12613 1727096139.05976: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-a9dd-d073-0000000001cd] 12613 1727096139.05978: sending task result for task 0afff68d-5257-a9dd-d073-0000000001cd 12613 1727096139.06264: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001cd 12613 1727096139.06270: WORKER PROCESS EXITING ok: [managed_node1] 12613 1727096139.06386: no more pending results, returning what we have 12613 1727096139.06388: results queue empty 12613 1727096139.06389: checking for any_errors_fatal 12613 1727096139.06391: done checking for any_errors_fatal 12613 1727096139.06391: checking for max_fail_percentage 12613 1727096139.06393: done checking for max_fail_percentage 12613 1727096139.06394: checking to see if all hosts have failed and the running result is not ok 
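The recurring "auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'" and mux_client_request_session entries are OpenSSH connection multiplexing: the ssh connection plugin reuses one persistent master connection per host instead of opening a fresh SSH session for every module run and cleanup command. Ansible enables this by default; purely as an illustration of the equivalent explicit settings, and not something taken from this run's inventory, a host or group variable could spell it out as:

    # group_vars sketch, illustrative only
    ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s -o ControlPath=~/.ansible/cp/%C"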
12613 1727096139.06394: done checking to see if all hosts have failed 12613 1727096139.06395: getting the remaining hosts for this loop 12613 1727096139.06396: done getting the remaining hosts for this loop 12613 1727096139.06400: getting the next task for host managed_node1 12613 1727096139.06410: done getting next task for host managed_node1 12613 1727096139.06413: ^ task is: TASK: Check if system is ostree 12613 1727096139.06415: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096139.06418: getting variables 12613 1727096139.06420: in VariableManager get_vars() 12613 1727096139.06449: Calling all_inventory to load vars for managed_node1 12613 1727096139.06454: Calling groups_inventory to load vars for managed_node1 12613 1727096139.06457: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096139.06672: Calling all_plugins_play to load vars for managed_node1 12613 1727096139.06676: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096139.06680: Calling groups_plugins_play to load vars for managed_node1 12613 1727096139.07014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.07207: done with get_vars() 12613 1727096139.07219: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 08:55:39 -0400 (0:00:00.828) 0:00:02.710 ****** 12613 1727096139.07316: entering _queue_task() for managed_node1/stat 12613 1727096139.08341: worker is 1 (out of 1 available) 12613 1727096139.08354: exiting _queue_task() for managed_node1/stat 12613 1727096139.08365: done queuing things up, now waiting for results queue to drain 12613 1727096139.08770: waiting for pending results... 
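The task queued here, "Check if system is ostree" (el_repo_setup.yml:17), drives the stat module run that follows; the conditional guarding it, not __network_is_ostree is defined, is evaluated a few records further down. A task of that shape would look roughly like the sketch below, where the stat path and the register name are illustrative assumptions and are not taken from this log:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed path; the actual path is not shown in this log excerpt
  register: __ostree_booted_stat    # hypothetical register name
  when: not __network_is_ostree is defined   # conditional as evaluated in the records below
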
12613 1727096139.09026: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 12613 1727096139.09258: in run() - task 0afff68d-5257-a9dd-d073-0000000001cf 12613 1727096139.09281: variable 'ansible_search_path' from source: unknown 12613 1727096139.09288: variable 'ansible_search_path' from source: unknown 12613 1727096139.09544: calling self._execute() 12613 1727096139.09548: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.09550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.09553: variable 'omit' from source: magic vars 12613 1727096139.10349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12613 1727096139.10855: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12613 1727096139.11041: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12613 1727096139.11110: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12613 1727096139.11273: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12613 1727096139.11347: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12613 1727096139.11452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12613 1727096139.11485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096139.11560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12613 1727096139.11799: Evaluated conditional (not __network_is_ostree is defined): True 12613 1727096139.11866: variable 'omit' from source: magic vars 12613 1727096139.11911: variable 'omit' from source: magic vars 12613 1727096139.12006: variable 'omit' from source: magic vars 12613 1727096139.12037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12613 1727096139.12104: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12613 1727096139.12202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12613 1727096139.12252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096139.12301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096139.12473: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12613 1727096139.12477: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.12479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.12543: Set connection var ansible_connection to ssh 12613 1727096139.12625: Set connection var ansible_module_compression to ZIP_DEFLATED 12613 1727096139.12640: 
Set connection var ansible_timeout to 10 12613 1727096139.12651: Set connection var ansible_shell_type to sh 12613 1727096139.12660: Set connection var ansible_pipelining to False 12613 1727096139.12831: Set connection var ansible_shell_executable to /bin/sh 12613 1727096139.12834: variable 'ansible_shell_executable' from source: unknown 12613 1727096139.12836: variable 'ansible_connection' from source: unknown 12613 1727096139.12838: variable 'ansible_module_compression' from source: unknown 12613 1727096139.12840: variable 'ansible_shell_type' from source: unknown 12613 1727096139.12842: variable 'ansible_shell_executable' from source: unknown 12613 1727096139.12844: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.12846: variable 'ansible_pipelining' from source: unknown 12613 1727096139.12847: variable 'ansible_timeout' from source: unknown 12613 1727096139.12849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.13115: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 12613 1727096139.13131: variable 'omit' from source: magic vars 12613 1727096139.13162: starting attempt loop 12613 1727096139.13171: running the handler 12613 1727096139.13473: _low_level_execute_command(): starting 12613 1727096139.13477: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12613 1727096139.14790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096139.15113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096139.15295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096139.17279: stdout chunk (state=3): >>>/root <<< 12613 1727096139.17284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096139.17286: stdout chunk (state=3): >>><<< 12613 1727096139.17288: stderr chunk (state=3): >>><<< 12613 1727096139.17373: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096139.17406: _low_level_execute_command(): starting 12613 1727096139.17422: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068 `" && echo ansible-tmp-1727096139.1738908-12765-89211687004068="` echo /root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068 `" ) && sleep 0' 12613 1727096139.18675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096139.18691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096139.18818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096139.19056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096139.19266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12613 1727096139.22102: stdout chunk (state=3): >>>ansible-tmp-1727096139.1738908-12765-89211687004068=/root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068 <<< 12613 1727096139.22183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096139.22210: stdout chunk (state=3): >>><<< 12613 1727096139.22223: stderr chunk (state=3): >>><<< 12613 1727096139.22476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096139.1738908-12765-89211687004068=/root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 12613 1727096139.22480: variable 'ansible_module_compression' from source: unknown 12613 1727096139.22482: ANSIBALLZ: Using lock for stat 12613 1727096139.22484: ANSIBALLZ: Acquiring lock 12613 1727096139.22486: ANSIBALLZ: Lock acquired: 140022598898208 12613 1727096139.22488: ANSIBALLZ: Creating module 12613 1727096139.42067: ANSIBALLZ: Writing module into payload 12613 1727096139.42187: ANSIBALLZ: Writing module 12613 1727096139.42232: ANSIBALLZ: Renaming module 12613 1727096139.42247: ANSIBALLZ: Done creating module 12613 1727096139.42273: variable 'ansible_facts' from source: unknown 12613 1727096139.42349: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/AnsiballZ_stat.py 12613 1727096139.42573: Sending initial data 12613 1727096139.42581: Sent initial data (152 bytes) 12613 1727096139.43181: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12613 1727096139.43329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096139.43334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096139.43355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096139.43454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12613 1727096139.45937: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12613 1727096139.46013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12613 1727096139.46107: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12613tatu8w7b/tmphutkdnj7 /root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/AnsiballZ_stat.py <<< 12613 1727096139.46136: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/AnsiballZ_stat.py" <<< 12613 1727096139.46207: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-12613tatu8w7b/tmphutkdnj7" to remote "/root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/AnsiballZ_stat.py" <<< 12613 1727096139.47319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096139.47402: stderr chunk (state=3): >>><<< 12613 1727096139.47406: stdout chunk (state=3): >>><<< 12613 1727096139.47419: done transferring module to remote 12613 1727096139.47446: _low_level_execute_command(): starting 12613 1727096139.47460: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/ /root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/AnsiballZ_stat.py && sleep 0' 12613 1727096139.48190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096139.48272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 12613 1727096139.48342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096139.48520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096139.51083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096139.51102: stderr chunk (state=3): >>><<< 12613 1727096139.51128: stdout chunk (state=3): >>><<< 12613 1727096139.51164: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12613 1727096139.51170: _low_level_execute_command(): starting 12613 1727096139.51173: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/AnsiballZ_stat.py && sleep 0' 12613 1727096139.51824: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096139.51829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 12613 1727096139.51844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096139.51872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12613 1727096139.51875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12613 1727096139.51878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096139.51981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096139.52218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12613 1727096139.54621: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 12613 1727096139.54664: stdout chunk (state=3): >>>import _imp # builtin<<< 12613 1727096139.54669: stdout chunk (state=3): >>> <<< 12613 1727096139.54705: stdout chunk (state=3): >>>import '_thread' # <<< 12613 1727096139.54709: stdout chunk (state=3): >>> <<< 12613 1727096139.54733: stdout chunk (state=3): >>>import '_warnings' # <<< 12613 1727096139.54757: stdout chunk (state=3): >>>import '_weakref' # <<< 12613 1727096139.54848: 
stdout chunk (state=3): >>>import '_io' # <<< 12613 1727096139.54862: stdout chunk (state=3): >>> <<< 12613 1727096139.54869: stdout chunk (state=3): >>>import 'marshal' # <<< 12613 1727096139.54922: stdout chunk (state=3): >>>import 'posix' # <<< 12613 1727096139.54975: stdout chunk (state=3): >>> import '_frozen_importlib_external' # <<< 12613 1727096139.54992: stdout chunk (state=3): >>># installing zipimport hook <<< 12613 1727096139.55025: stdout chunk (state=3): >>>import 'time' # <<< 12613 1727096139.55048: stdout chunk (state=3): >>> import 'zipimport' # <<< 12613 1727096139.55063: stdout chunk (state=3): >>> # installed zipimport hook<<< 12613 1727096139.55131: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 12613 1727096139.55137: stdout chunk (state=3): >>> <<< 12613 1727096139.55140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 12613 1727096139.55157: stdout chunk (state=3): >>> <<< 12613 1727096139.55189: stdout chunk (state=3): >>>import '_codecs' # <<< 12613 1727096139.55192: stdout chunk (state=3): >>> <<< 12613 1727096139.55228: stdout chunk (state=3): >>>import 'codecs' # <<< 12613 1727096139.55233: stdout chunk (state=3): >>> <<< 12613 1727096139.55284: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12613 1727096139.55329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c4e84d0><<< 12613 1727096139.55342: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c4b7b30><<< 12613 1727096139.55380: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 12613 1727096139.55394: stdout chunk (state=3): >>> <<< 12613 1727096139.55407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 12613 1727096139.55410: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c4eaa50><<< 12613 1727096139.55446: stdout chunk (state=3): >>> import '_signal' # <<< 12613 1727096139.55487: stdout chunk (state=3): >>>import '_abc' # <<< 12613 1727096139.55494: stdout chunk (state=3): >>> <<< 12613 1727096139.55532: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 12613 1727096139.55581: stdout chunk (state=3): >>>import '_stat' # <<< 12613 1727096139.55682: stdout chunk (state=3): >>> import 'stat' # <<< 12613 1727096139.55730: stdout chunk (state=3): >>>import '_collections_abc' # <<< 12613 1727096139.55735: stdout chunk (state=3): >>> <<< 12613 1727096139.55776: stdout chunk (state=3): >>>import 'genericpath' # <<< 12613 1727096139.55782: stdout chunk (state=3): >>> <<< 12613 1727096139.55796: stdout chunk (state=3): >>>import 'posixpath' # <<< 12613 1727096139.55843: stdout chunk (state=3): >>>import 'os' # <<< 12613 1727096139.55849: stdout chunk (state=3): >>> <<< 12613 1727096139.55874: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 12613 1727096139.55879: stdout chunk (state=3): >>> <<< 12613 1727096139.55904: stdout chunk (state=3): >>>Processing user 
site-packages<<< 12613 1727096139.55907: stdout chunk (state=3): >>> <<< 12613 1727096139.55929: stdout chunk (state=3): >>>Processing global site-packages <<< 12613 1727096139.55950: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages'<<< 12613 1727096139.55975: stdout chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 12613 1727096139.56023: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 12613 1727096139.56057: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c299130><<< 12613 1727096139.56060: stdout chunk (state=3): >>> <<< 12613 1727096139.56133: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 12613 1727096139.56162: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 12613 1727096139.56171: stdout chunk (state=3): >>> <<< 12613 1727096139.56186: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c299fa0><<< 12613 1727096139.56224: stdout chunk (state=3): >>> import 'site' # <<< 12613 1727096139.56260: stdout chunk (state=3): >>> <<< 12613 1727096139.56303: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information.<<< 12613 1727096139.56402: stdout chunk (state=3): >>> <<< 12613 1727096139.56672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12613 1727096139.56726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12613 1727096139.56771: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 12613 1727096139.56774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 12613 1727096139.56781: stdout chunk (state=3): >>> <<< 12613 1727096139.56807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12613 1727096139.56874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12613 1727096139.56978: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 12613 1727096139.56985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 12613 1727096139.57045: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f964c2d7f50> <<< 12613 1727096139.57088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12613 1727096139.57294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c30f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c30ff20> import '_collections' # <<< 12613 1727096139.57315: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2efb60> import '_functools' # <<< 12613 1727096139.57328: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2ed280> <<< 12613 1727096139.57423: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d5040> <<< 12613 1727096139.57440: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 12613 1727096139.57462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 12613 1727096139.57483: stdout chunk (state=3): >>>import '_sre' # <<< 12613 1727096139.57700: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c32f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c32e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 12613 1727096139.57709: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2ee150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c32cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c364890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d42c0> <<< 12613 1727096139.57738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object 
from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12613 1727096139.57754: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c364d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c364bf0> <<< 12613 1727096139.57786: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096139.57798: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c364fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d2de0> <<< 12613 1727096139.57825: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 12613 1727096139.57891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c3656d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c3653a0> <<< 12613 1727096139.57991: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c3665d0> import 'importlib.util' # import 'runpy' # <<< 12613 1727096139.58085: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c37c7a0> <<< 12613 1727096139.58092: stdout chunk (state=3): >>>import 'errno' # <<< 12613 1727096139.58110: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c37deb0> <<< 12613 1727096139.58186: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 
12613 1727096139.58189: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c37ed50> <<< 12613 1727096139.58321: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c37f380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c37e2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12613 1727096139.58324: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c37fe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c37f530> <<< 12613 1727096139.58387: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c366570> <<< 12613 1727096139.58392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12613 1727096139.58432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12613 1727096139.58436: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12613 1727096139.58450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12613 1727096139.58482: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096139.58491: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c107ce0> <<< 12613 1727096139.58504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12613 1727096139.58534: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096139.58567: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c130740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1304a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096139.58575: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c130770> <<< 12613 1727096139.58603: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 12613 1727096139.58648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12613 1727096139.58666: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096139.58788: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c1310a0> <<< 12613 1727096139.59024: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c131a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c130950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c105e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12613 1727096139.59077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c132e10> <<< 12613 1727096139.59104: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1318e0> <<< 12613 1727096139.59120: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c366cc0> <<< 12613 1727096139.59233: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096139.59253: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12613 1727096139.59298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12613 1727096139.59328: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c15b170> <<< 12613 1727096139.59401: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12613 1727096139.59415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096139.59437: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12613 1727096139.59458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12613 1727096139.59540: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f964c17f4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12613 1727096139.59596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12613 1727096139.59700: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1e02f0> <<< 12613 1727096139.59722: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12613 1727096139.59785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12613 1727096139.59839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12613 1727096139.59964: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1e2a20> <<< 12613 1727096139.60094: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1e03e0> <<< 12613 1727096139.60146: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1a52e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb253d0> <<< 12613 1727096139.60174: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c17e300> <<< 12613 1727096139.60185: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c133d40> <<< 12613 1727096139.60371: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f964c17e660> <<< 12613 1727096139.60620: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_8aa3wzx3/ansible_stat_payload.zip' # zipimport: zlib available<<< 12613 1727096139.60691: stdout chunk (state=3): >>> <<< 12613 1727096139.60876: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.60935: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12613 1727096139.60973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12613 1727096139.61018: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 12613 1727096139.61091: stdout chunk (state=3): >>> <<< 12613 1727096139.61161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 12613 1727096139.61166: stdout chunk (state=3): >>> <<< 12613 1727096139.61219: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 12613 1727096139.61231: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 12613 1727096139.61237: stdout chunk (state=3): >>> <<< 12613 1727096139.61264: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb7b0e0> import '_typing' # <<< 12613 1727096139.61390: stdout chunk (state=3): >>> <<< 12613 1727096139.61579: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb59fd0><<< 12613 1727096139.61588: stdout chunk (state=3): >>> <<< 12613 1727096139.61611: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb59160><<< 12613 1727096139.61618: stdout chunk (state=3): >>> <<< 12613 1727096139.61637: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096139.61640: stdout chunk (state=3): >>> <<< 12613 1727096139.61688: stdout chunk (state=3): >>>import 'ansible' # <<< 12613 1727096139.61696: stdout chunk (state=3): >>> <<< 12613 1727096139.61718: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096139.61726: stdout chunk (state=3): >>> <<< 12613 1727096139.61758: stdout chunk (state=3): >>># zipimport: zlib available<<< 12613 1727096139.61763: stdout chunk (state=3): >>> <<< 12613 1727096139.61805: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 12613 1727096139.61842: stdout chunk (state=3): >>> # zipimport: zlib available <<< 12613 1727096139.63445: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.64623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb79760> <<< 12613 1727096139.64656: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096139.64686: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12613 1727096139.64729: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12613 1727096139.64732: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bba2ab0> <<< 12613 1727096139.64771: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bba2870> <<< 12613 1727096139.64824: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bba2180> <<< 12613 
1727096139.64836: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12613 1727096139.64889: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bba2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c4ea9c0> <<< 12613 1727096139.64892: stdout chunk (state=3): >>>import 'atexit' # <<< 12613 1727096139.64942: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bba3830> <<< 12613 1727096139.64960: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bba39e0> <<< 12613 1727096139.64980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12613 1727096139.65046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 12613 1727096139.65096: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bba3f20> import 'pwd' # <<< 12613 1727096139.65121: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12613 1727096139.65153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12613 1727096139.65189: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba0dbb0> <<< 12613 1727096139.65223: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096139.65242: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba0f860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 12613 1727096139.65287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12613 1727096139.65298: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba10260> <<< 12613 1727096139.65321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12613 1727096139.65349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12613 1727096139.65382: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba11130> <<< 12613 1727096139.65404: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12613 1727096139.65445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12613 1727096139.65457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12613 1727096139.65521: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba13e90> <<< 12613 1727096139.65557: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bb5b0b0> <<< 12613 1727096139.65587: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba12150> <<< 12613 1727096139.65602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12613 1727096139.65643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12613 1727096139.65657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 12613 1727096139.65683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12613 1727096139.65707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12613 1727096139.65761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 12613 1727096139.65783: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba1bd10> import '_tokenize' # <<< 12613 1727096139.65856: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba1a7e0> <<< 12613 1727096139.65884: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba1a540> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12613 1727096139.65942: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba1aa50> <<< 12613 1727096139.65987: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba12660> <<< 12613 1727096139.66025: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f964ba63ec0> <<< 12613 1727096139.66043: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba63fb0> <<< 12613 1727096139.66069: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 12613 1727096139.66111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 12613 1727096139.66152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 12613 1727096139.66155: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba65ac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba65880> <<< 12613 1727096139.66177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12613 1727096139.66312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12613 1727096139.66351: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba67fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba66180> <<< 12613 1727096139.66378: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12613 1727096139.66438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096139.66461: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 12613 1727096139.66474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12613 1727096139.66517: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba6b710> <<< 12613 1727096139.66649: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba680e0> <<< 12613 1727096139.66709: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f964ba6c4d0> <<< 12613 1727096139.66749: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba6c500> <<< 12613 1727096139.66788: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba6ca10> <<< 12613 1727096139.66816: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba641a0> <<< 12613 1727096139.66840: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 12613 1727096139.66872: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12613 1727096139.66886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12613 1727096139.66905: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096139.66930: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964baf81d0> <<< 12613 1727096139.67101: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964baf94c0> <<< 12613 1727096139.67149: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba6e960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 12613 1727096139.67186: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba6fd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba6e570> # zipimport: zlib available <<< 12613 1727096139.67223: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 12613 1727096139.67236: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.67306: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.67429: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.67435: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 12613 1727096139.67464: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 12613 1727096139.67594: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.67707: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.68272: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.68839: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 12613 1727096139.68871: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 12613 1727096139.68893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096139.68943: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bafd670> <<< 12613 1727096139.69031: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12613 1727096139.69059: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bafe450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba6d820> <<< 12613 1727096139.69127: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12613 1727096139.69131: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.69171: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 12613 1727096139.69175: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.69324: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.69478: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 12613 1727096139.69506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bafe4e0> # zipimport: zlib available <<< 12613 1727096139.69981: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.70430: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.70506: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.70589: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12613 1727096139.70593: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.70621: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.70670: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12613 1727096139.70674: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.70744: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12613 1727096139.70860: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12613 1727096139.70864: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.70883: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 12613 1727096139.70909: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.70962: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 12613 1727096139.71195: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.71433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12613 1727096139.71511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12613 1727096139.71514: stdout chunk (state=3): >>>import '_ast' # <<< 12613 1727096139.71588: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964baff5f0> <<< 12613 1727096139.71591: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.71671: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.71747: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 12613 1727096139.71781: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 12613 1727096139.71793: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.71831: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.71876: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12613 1727096139.71879: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.71918: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.71971: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.72022: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.72093: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12613 1727096139.72129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096139.72234: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964b90a120> <<< 12613 1727096139.72270: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964b907e60> <<< 12613 1727096139.72301: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 12613 1727096139.72321: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.72384: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.72448: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12613 1727096139.72476: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.72516: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12613 1727096139.72543: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 12613 1727096139.72576: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12613 1727096139.72590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12613 1727096139.72678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12613 1727096139.72681: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12613 1727096139.72702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12613 1727096139.72742: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bbfe960> <<< 12613 1727096139.72805: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bbea630> <<< 12613 1727096139.72891: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964b90a0f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb59f40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 12613 1727096139.72925: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.72949: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.72962: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12613 1727096139.73041: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12613 1727096139.73059: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 12613 1727096139.73077: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.73211: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.73406: stdout chunk (state=3): >>># zipimport: zlib available <<< 12613 1727096139.73527: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 12613 1727096139.73563: stdout chunk (state=3): >>># destroy __main__ <<< 12613 1727096139.73943: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 12613 1727096139.73962: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # 
cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib <<< 12613 1727096139.74021: stdout chunk (state=3): >>># cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy <<< 12613 1727096139.74046: stdout chunk (state=3): >>># cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] 
removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128<<< 12613 1727096139.74129: stdout chunk (state=3): >>> # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast <<< 12613 1727096139.74150: stdout chunk (state=3): >>># destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file<<< 12613 1727096139.74154: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 12613 1727096139.74414: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 12613 1727096139.74441: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 12613 1727096139.74510: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 12613 1727096139.74517: stdout chunk (state=3): >>># destroy ntpath <<< 12613 1727096139.74537: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 12613 1727096139.74592: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 12613 1727096139.74630: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array <<< 12613 1727096139.74653: stdout chunk (state=3): >>># destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 12613 1727096139.74723: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves <<< 12613 1727096139.74727: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader <<< 12613 1727096139.74792: stdout chunk (state=3): >>># cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping 
contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 12613 1727096139.74839: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 12613 1727096139.74872: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 12613 1727096139.74912: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12613 1727096139.75066: stdout chunk (state=3): >>># destroy sys.monitoring <<< 12613 1727096139.75116: stdout chunk (state=3): >>># destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 12613 1727096139.75155: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 12613 1727096139.75190: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 12613 1727096139.75212: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12613 1727096139.75313: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 12613 1727096139.75542: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 12613 1727096139.75637: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # 
destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 12613 1727096139.76347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 12613 1727096139.76351: stdout chunk (state=3): >>><<< 12613 1727096139.76353: stderr chunk (state=3): >>><<< 12613 1727096139.76378: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c4e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c4b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c4eaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c299130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c299fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d7f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c30f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c30ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2efb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2ed280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d5040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c32f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c32e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2ee150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c32cc80> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c364890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d42c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c364d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c364bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c364fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c2d2de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c3656d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c3653a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c3665d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c37c7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c37deb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f964c37ed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c37f380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c37e2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c37fe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c37f530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c366570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c107ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c130740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1304a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c130770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c1310a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964c131a60> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f964c130950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c105e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c132e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1318e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c366cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c15b170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c17f4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1e02f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1e2a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1e03e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c1a52e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c17e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c133d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f964c17e660> # zipimport: found 30 names in '/tmp/ansible_stat_payload_8aa3wzx3/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb7b0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb59fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb59160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb79760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bba2ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bba2870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bba2180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bba2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964c4ea9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bba3830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bba39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bba3f20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba0dbb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba0f860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba10260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba11130> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba13e90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bb5b0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba12150> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba1bd10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba1a7e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba1a540> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba1aa50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba12660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba63ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba63fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba65ac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba65880> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba67fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba66180> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba6b710> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba680e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba6c4d0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba6c500> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba6ca10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964baf81d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964baf94c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba6e960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964ba6fd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba6e570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f964bafd670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bafe450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964ba6d820> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bafe4e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964baff5f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f964b90a120> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964b907e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bbfe960> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bbea630> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964b90a0f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f964bb59f40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
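(Editor's note, not part of the captured output.) The module invocation whose result appears above is the "Check if system is ostree" probe named further down in this log: a stat call against /run/ostree-booted, and its result ({"changed": false, "stat": {"exists": false}}) means the marker file is absent, i.e. the managed host is not an ostree-based system. A minimal sketch of what that task plausibly looks like in el_repo_setup.yml, reconstructed only from the module_args shown in the log (the register name is inferred from the __ostree_booted_stat variable referenced later; the real task file may differ, and the remaining module_args such as get_checksum are stat's defaults added by Ansible, not task parameters):

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat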
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 12613 1727096139.78354: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12613 1727096139.78358: _low_level_execute_command(): starting 12613 1727096139.78362: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727096139.1738908-12765-89211687004068/ > /dev/null 2>&1 && sleep 0' 12613 1727096139.78382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12613 1727096139.78401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12613 1727096139.78416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 12613 1727096139.78427: stderr chunk (state=3): >>>debug2: match found <<< 12613 1727096139.78457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12613 1727096139.78530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 12613 1727096139.78574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12613 1727096139.78600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12613 1727096139.78698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12613 1727096139.81075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12613 1727096139.81475: stdout chunk (state=3): >>><<< 12613 1727096139.81479: stderr chunk (state=3): >>><<< 12613 1727096139.81482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 12613 1727096139.81484: handler run complete 12613 1727096139.81486: attempt loop complete, returning result 12613 1727096139.81488: _execute() done 12613 1727096139.81490: dumping result to json 12613 1727096139.81492: done dumping result, returning 12613 1727096139.81494: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree 
[0afff68d-5257-a9dd-d073-0000000001cf] 12613 1727096139.81496: sending task result for task 0afff68d-5257-a9dd-d073-0000000001cf 12613 1727096139.81566: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001cf 12613 1727096139.81571: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 12613 1727096139.81636: no more pending results, returning what we have 12613 1727096139.81639: results queue empty 12613 1727096139.81640: checking for any_errors_fatal 12613 1727096139.81646: done checking for any_errors_fatal 12613 1727096139.81647: checking for max_fail_percentage 12613 1727096139.81650: done checking for max_fail_percentage 12613 1727096139.81651: checking to see if all hosts have failed and the running result is not ok 12613 1727096139.81652: done checking to see if all hosts have failed 12613 1727096139.81652: getting the remaining hosts for this loop 12613 1727096139.81654: done getting the remaining hosts for this loop 12613 1727096139.81657: getting the next task for host managed_node1 12613 1727096139.81663: done getting next task for host managed_node1 12613 1727096139.81665: ^ task is: TASK: Set flag to indicate system is ostree 12613 1727096139.81669: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096139.81673: getting variables 12613 1727096139.81675: in VariableManager get_vars() 12613 1727096139.81707: Calling all_inventory to load vars for managed_node1 12613 1727096139.81710: Calling groups_inventory to load vars for managed_node1 12613 1727096139.81713: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096139.81728: Calling all_plugins_play to load vars for managed_node1 12613 1727096139.81731: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096139.81734: Calling groups_plugins_play to load vars for managed_node1 12613 1727096139.82243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.82443: done with get_vars() 12613 1727096139.82457: done getting variables 12613 1727096139.82585: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 08:55:39 -0400 (0:00:00.753) 0:00:03.463 ****** 12613 1727096139.82640: entering _queue_task() for managed_node1/set_fact 12613 1727096139.82642: Creating lock for set_fact 12613 1727096139.83107: worker is 1 (out of 1 available) 12613 1727096139.83117: exiting _queue_task() for managed_node1/set_fact 12613 1727096139.83127: done queuing things up, now waiting for results queue to drain 12613 1727096139.83128: waiting for pending results... 
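(Editor's note, not part of the captured output.) The trace that follows evaluates the conditional (not __network_is_ostree is defined), reads __ostree_booted_stat, and returns ansible_facts with __network_is_ostree: false. A plausible reconstruction of the set_fact task, assuming the flag is simply derived from the exists field of the earlier stat result (the actual tasks file may differ):

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

The when clause, which the log shows evaluating to True here, presumably lets a caller pre-set __network_is_ostree and skip the probe entirely.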
12613 1727096139.83414: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 12613 1727096139.83661: in run() - task 0afff68d-5257-a9dd-d073-0000000001d0 12613 1727096139.83706: variable 'ansible_search_path' from source: unknown 12613 1727096139.83795: variable 'ansible_search_path' from source: unknown 12613 1727096139.83799: calling self._execute() 12613 1727096139.83836: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.83853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.83904: variable 'omit' from source: magic vars 12613 1727096139.84320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12613 1727096139.84597: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12613 1727096139.84646: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12613 1727096139.84770: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12613 1727096139.84774: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12613 1727096139.84812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12613 1727096139.84843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12613 1727096139.84881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096139.84911: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12613 1727096139.85035: Evaluated conditional (not __network_is_ostree is defined): True 12613 1727096139.85046: variable 'omit' from source: magic vars 12613 1727096139.85094: variable 'omit' from source: magic vars 12613 1727096139.85202: variable '__ostree_booted_stat' from source: set_fact 12613 1727096139.85250: variable 'omit' from source: magic vars 12613 1727096139.85281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12613 1727096139.85312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12613 1727096139.85336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12613 1727096139.85413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096139.85417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096139.85420: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12613 1727096139.85422: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.85428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.85532: Set connection var ansible_connection to ssh 12613 
1727096139.85635: Set connection var ansible_module_compression to ZIP_DEFLATED 12613 1727096139.85650: Set connection var ansible_timeout to 10 12613 1727096139.85665: Set connection var ansible_shell_type to sh 12613 1727096139.85677: Set connection var ansible_pipelining to False 12613 1727096139.85744: Set connection var ansible_shell_executable to /bin/sh 12613 1727096139.85747: variable 'ansible_shell_executable' from source: unknown 12613 1727096139.85750: variable 'ansible_connection' from source: unknown 12613 1727096139.85754: variable 'ansible_module_compression' from source: unknown 12613 1727096139.85756: variable 'ansible_shell_type' from source: unknown 12613 1727096139.85758: variable 'ansible_shell_executable' from source: unknown 12613 1727096139.85760: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.85763: variable 'ansible_pipelining' from source: unknown 12613 1727096139.85765: variable 'ansible_timeout' from source: unknown 12613 1727096139.85769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.85866: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12613 1727096139.85886: variable 'omit' from source: magic vars 12613 1727096139.85896: starting attempt loop 12613 1727096139.85903: running the handler 12613 1727096139.85918: handler run complete 12613 1727096139.85961: attempt loop complete, returning result 12613 1727096139.85964: _execute() done 12613 1727096139.85967: dumping result to json 12613 1727096139.85971: done dumping result, returning 12613 1727096139.85973: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0afff68d-5257-a9dd-d073-0000000001d0] 12613 1727096139.85976: sending task result for task 0afff68d-5257-a9dd-d073-0000000001d0 12613 1727096139.86133: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001d0 12613 1727096139.86136: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 12613 1727096139.86222: no more pending results, returning what we have 12613 1727096139.86224: results queue empty 12613 1727096139.86226: checking for any_errors_fatal 12613 1727096139.86232: done checking for any_errors_fatal 12613 1727096139.86233: checking for max_fail_percentage 12613 1727096139.86235: done checking for max_fail_percentage 12613 1727096139.86236: checking to see if all hosts have failed and the running result is not ok 12613 1727096139.86236: done checking to see if all hosts have failed 12613 1727096139.86237: getting the remaining hosts for this loop 12613 1727096139.86238: done getting the remaining hosts for this loop 12613 1727096139.86242: getting the next task for host managed_node1 12613 1727096139.86254: done getting next task for host managed_node1 12613 1727096139.86257: ^ task is: TASK: Fix CentOS6 Base repo 12613 1727096139.86259: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096139.86264: getting variables 12613 1727096139.86266: in VariableManager get_vars() 12613 1727096139.86298: Calling all_inventory to load vars for managed_node1 12613 1727096139.86301: Calling groups_inventory to load vars for managed_node1 12613 1727096139.86305: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096139.86317: Calling all_plugins_play to load vars for managed_node1 12613 1727096139.86320: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096139.86330: Calling groups_plugins_play to load vars for managed_node1 12613 1727096139.86840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.87028: done with get_vars() 12613 1727096139.87038: done getting variables 12613 1727096139.87225: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 08:55:39 -0400 (0:00:00.046) 0:00:03.509 ****** 12613 1727096139.87255: entering _queue_task() for managed_node1/copy 12613 1727096139.87582: worker is 1 (out of 1 available) 12613 1727096139.87593: exiting _queue_task() for managed_node1/copy 12613 1727096139.87612: done queuing things up, now waiting for results queue to drain 12613 1727096139.87613: waiting for pending results... 
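(Editor's note, not part of the captured output.) The next task, "Fix CentOS6 Base repo", is a copy action that the trace below skips because ansible_distribution_major_version == '6' evaluates to False on this CentOS host. A hedged sketch of its shape, with the destination and copied content reduced to hypothetical placeholders since the log only reveals the module name and the two conditionals:

    - name: Fix CentOS6 Base repo
      copy:
        # hypothetical destination and content; the log does not show them
        dest: /etc/yum.repos.d/CentOS-Base.repo
        content: |
          # placeholder repo definition
      when:
        - ansible_distribution == 'CentOS'
        - ansible_distribution_major_version == '6'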
12613 1727096139.87888: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 12613 1727096139.88093: in run() - task 0afff68d-5257-a9dd-d073-0000000001d2 12613 1727096139.88097: variable 'ansible_search_path' from source: unknown 12613 1727096139.88100: variable 'ansible_search_path' from source: unknown 12613 1727096139.88202: calling self._execute() 12613 1727096139.88254: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.88318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.88331: variable 'omit' from source: magic vars 12613 1727096139.88745: variable 'ansible_distribution' from source: facts 12613 1727096139.88792: Evaluated conditional (ansible_distribution == 'CentOS'): True 12613 1727096139.88904: variable 'ansible_distribution_major_version' from source: facts 12613 1727096139.88962: Evaluated conditional (ansible_distribution_major_version == '6'): False 12613 1727096139.88965: when evaluation is False, skipping this task 12613 1727096139.88969: _execute() done 12613 1727096139.88972: dumping result to json 12613 1727096139.88974: done dumping result, returning 12613 1727096139.88976: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0afff68d-5257-a9dd-d073-0000000001d2] 12613 1727096139.88978: sending task result for task 0afff68d-5257-a9dd-d073-0000000001d2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 12613 1727096139.89132: no more pending results, returning what we have 12613 1727096139.89136: results queue empty 12613 1727096139.89137: checking for any_errors_fatal 12613 1727096139.89142: done checking for any_errors_fatal 12613 1727096139.89143: checking for max_fail_percentage 12613 1727096139.89144: done checking for max_fail_percentage 12613 1727096139.89145: checking to see if all hosts have failed and the running result is not ok 12613 1727096139.89146: done checking to see if all hosts have failed 12613 1727096139.89147: getting the remaining hosts for this loop 12613 1727096139.89148: done getting the remaining hosts for this loop 12613 1727096139.89155: getting the next task for host managed_node1 12613 1727096139.89161: done getting next task for host managed_node1 12613 1727096139.89164: ^ task is: TASK: Include the task 'enable_epel.yml' 12613 1727096139.89169: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096139.89173: getting variables 12613 1727096139.89175: in VariableManager get_vars() 12613 1727096139.89204: Calling all_inventory to load vars for managed_node1 12613 1727096139.89206: Calling groups_inventory to load vars for managed_node1 12613 1727096139.89210: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096139.89222: Calling all_plugins_play to load vars for managed_node1 12613 1727096139.89225: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096139.89228: Calling groups_plugins_play to load vars for managed_node1 12613 1727096139.89642: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001d2 12613 1727096139.89646: WORKER PROCESS EXITING 12613 1727096139.89675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.89920: done with get_vars() 12613 1727096139.89930: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 08:55:39 -0400 (0:00:00.027) 0:00:03.537 ****** 12613 1727096139.90019: entering _queue_task() for managed_node1/include_tasks 12613 1727096139.90292: worker is 1 (out of 1 available) 12613 1727096139.90305: exiting _queue_task() for managed_node1/include_tasks 12613 1727096139.90317: done queuing things up, now waiting for results queue to drain 12613 1727096139.90319: waiting for pending results... 12613 1727096139.90516: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 12613 1727096139.90601: in run() - task 0afff68d-5257-a9dd-d073-0000000001d3 12613 1727096139.90614: variable 'ansible_search_path' from source: unknown 12613 1727096139.90617: variable 'ansible_search_path' from source: unknown 12613 1727096139.90650: calling self._execute() 12613 1727096139.90725: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.90728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.90739: variable 'omit' from source: magic vars 12613 1727096139.91123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096139.93375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096139.93476: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096139.93534: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096139.93581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096139.93615: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096139.93748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096139.93775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096139.93806: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096139.93891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096139.94235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096139.94311: variable '__network_is_ostree' from source: set_fact 12613 1727096139.94334: Evaluated conditional (not __network_is_ostree | d(false)): True 12613 1727096139.94344: _execute() done 12613 1727096139.94351: dumping result to json 12613 1727096139.94357: done dumping result, returning 12613 1727096139.94369: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-a9dd-d073-0000000001d3] 12613 1727096139.94378: sending task result for task 0afff68d-5257-a9dd-d073-0000000001d3 12613 1727096139.94511: no more pending results, returning what we have 12613 1727096139.94516: in VariableManager get_vars() 12613 1727096139.94548: Calling all_inventory to load vars for managed_node1 12613 1727096139.94553: Calling groups_inventory to load vars for managed_node1 12613 1727096139.94561: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096139.94573: Calling all_plugins_play to load vars for managed_node1 12613 1727096139.94576: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096139.94580: Calling groups_plugins_play to load vars for managed_node1 12613 1727096139.94842: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001d3 12613 1727096139.94845: WORKER PROCESS EXITING 12613 1727096139.94871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.95070: done with get_vars() 12613 1727096139.95079: variable 'ansible_search_path' from source: unknown 12613 1727096139.95080: variable 'ansible_search_path' from source: unknown 12613 1727096139.95116: we have included files to process 12613 1727096139.95117: generating all_blocks data 12613 1727096139.95119: done generating all_blocks data 12613 1727096139.95125: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12613 1727096139.95126: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12613 1727096139.95128: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12613 1727096139.95838: done processing included file 12613 1727096139.95841: iterating over new_blocks loaded from include file 12613 1727096139.95842: in VariableManager get_vars() 12613 1727096139.95854: done with get_vars() 12613 1727096139.95855: filtering new block on tags 12613 1727096139.95880: done filtering new block on tags 12613 1727096139.95883: in VariableManager get_vars() 12613 1727096139.95893: done with get_vars() 12613 1727096139.95894: filtering new block on tags 12613 1727096139.95907: done filtering new block on tags 12613 1727096139.95914: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 12613 1727096139.95919: extending task lists for all hosts with included blocks 12613 1727096139.96021: done extending task lists 12613 1727096139.96022: done processing included files 12613 1727096139.96023: results queue empty 12613 1727096139.96024: checking for any_errors_fatal 12613 1727096139.96028: done checking for any_errors_fatal 12613 1727096139.96028: checking for max_fail_percentage 12613 1727096139.96029: done checking for max_fail_percentage 12613 1727096139.96030: checking to see if all hosts have failed and the running result is not ok 12613 1727096139.96031: done checking to see if all hosts have failed 12613 1727096139.96031: getting the remaining hosts for this loop 12613 1727096139.96032: done getting the remaining hosts for this loop 12613 1727096139.96034: getting the next task for host managed_node1 12613 1727096139.96039: done getting next task for host managed_node1 12613 1727096139.96041: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 12613 1727096139.96043: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096139.96045: getting variables 12613 1727096139.96046: in VariableManager get_vars() 12613 1727096139.96054: Calling all_inventory to load vars for managed_node1 12613 1727096139.96056: Calling groups_inventory to load vars for managed_node1 12613 1727096139.96058: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096139.96063: Calling all_plugins_play to load vars for managed_node1 12613 1727096139.96073: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096139.96077: Calling groups_plugins_play to load vars for managed_node1 12613 1727096139.96238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.96438: done with get_vars() 12613 1727096139.96447: done getting variables 12613 1727096139.96519: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 12613 1727096139.96640: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 08:55:39 -0400 (0:00:00.066) 0:00:03.604 ****** 12613 1727096139.96690: entering _queue_task() for managed_node1/command 12613 1727096139.96691: Creating lock for command 12613 1727096139.97009: worker is 1 (out of 1 available) 12613 1727096139.97020: exiting _queue_task() for managed_node1/command 12613 1727096139.97031: done queuing things up, now waiting for results queue to drain 12613 1727096139.97033: waiting for pending results... 
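The "Create EPEL 10" task queued above is the first task pulled in from the just-included enable_epel.yml; it was only reached because the (not __network_is_ostree | d(false)) check on the include passed. Its name is templated on ansible_distribution_major_version (which renders to 10 on this host), and the evaluation that follows shows it is guarded to run only on RedHat/CentOS 7 or 8. A hedged sketch of the task at enable_epel.yml:8; the command body never appears in this trace and is only a placeholder:

  - name: Create EPEL {{ ansible_distribution_major_version }}
    command: echo "set up the EPEL repository here"   # placeholder, real command not shown in this log
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']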
12613 1727096139.97342: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 12613 1727096139.97416: in run() - task 0afff68d-5257-a9dd-d073-0000000001ed 12613 1727096139.97442: variable 'ansible_search_path' from source: unknown 12613 1727096139.97451: variable 'ansible_search_path' from source: unknown 12613 1727096139.97493: calling self._execute() 12613 1727096139.97656: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096139.97660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096139.97663: variable 'omit' from source: magic vars 12613 1727096139.97971: variable 'ansible_distribution' from source: facts 12613 1727096139.97992: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12613 1727096139.98124: variable 'ansible_distribution_major_version' from source: facts 12613 1727096139.98137: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12613 1727096139.98145: when evaluation is False, skipping this task 12613 1727096139.98152: _execute() done 12613 1727096139.98159: dumping result to json 12613 1727096139.98166: done dumping result, returning 12613 1727096139.98181: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0afff68d-5257-a9dd-d073-0000000001ed] 12613 1727096139.98191: sending task result for task 0afff68d-5257-a9dd-d073-0000000001ed skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12613 1727096139.98375: no more pending results, returning what we have 12613 1727096139.98379: results queue empty 12613 1727096139.98380: checking for any_errors_fatal 12613 1727096139.98382: done checking for any_errors_fatal 12613 1727096139.98383: checking for max_fail_percentage 12613 1727096139.98385: done checking for max_fail_percentage 12613 1727096139.98386: checking to see if all hosts have failed and the running result is not ok 12613 1727096139.98387: done checking to see if all hosts have failed 12613 1727096139.98387: getting the remaining hosts for this loop 12613 1727096139.98389: done getting the remaining hosts for this loop 12613 1727096139.98392: getting the next task for host managed_node1 12613 1727096139.98401: done getting next task for host managed_node1 12613 1727096139.98404: ^ task is: TASK: Install yum-utils package 12613 1727096139.98408: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096139.98412: getting variables 12613 1727096139.98415: in VariableManager get_vars() 12613 1727096139.98447: Calling all_inventory to load vars for managed_node1 12613 1727096139.98450: Calling groups_inventory to load vars for managed_node1 12613 1727096139.98453: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096139.98685: Calling all_plugins_play to load vars for managed_node1 12613 1727096139.98689: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096139.98694: Calling groups_plugins_play to load vars for managed_node1 12613 1727096139.98937: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001ed 12613 1727096139.98941: WORKER PROCESS EXITING 12613 1727096139.98964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096139.99164: done with get_vars() 12613 1727096139.99176: done getting variables 12613 1727096139.99280: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 08:55:39 -0400 (0:00:00.026) 0:00:03.630 ****** 12613 1727096139.99310: entering _queue_task() for managed_node1/package 12613 1727096139.99312: Creating lock for package 12613 1727096139.99608: worker is 1 (out of 1 available) 12613 1727096139.99621: exiting _queue_task() for managed_node1/package 12613 1727096139.99633: done queuing things up, now waiting for results queue to drain 12613 1727096139.99634: waiting for pending results... 
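The "Install yum-utils package" task queued above uses the package action plugin (loaded a few entries back) and, as its skip result below shows, carries the same distribution/version guard as the other EPEL setup tasks. A hedged sketch of the task at enable_epel.yml:26; the package name follows from the task name, the state is an assumption:

  - name: Install yum-utils package
    package:
      name: yum-utils
      state: present   # assumed; only the package action and the when-guard are visible in the log
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']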
12613 1727096139.99887: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 12613 1727096139.99998: in run() - task 0afff68d-5257-a9dd-d073-0000000001ee 12613 1727096140.00020: variable 'ansible_search_path' from source: unknown 12613 1727096140.00026: variable 'ansible_search_path' from source: unknown 12613 1727096140.00064: calling self._execute() 12613 1727096140.00148: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.00159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.00198: variable 'omit' from source: magic vars 12613 1727096140.00659: variable 'ansible_distribution' from source: facts 12613 1727096140.00677: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12613 1727096140.00811: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.00851: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12613 1727096140.00855: when evaluation is False, skipping this task 12613 1727096140.00858: _execute() done 12613 1727096140.00860: dumping result to json 12613 1727096140.00863: done dumping result, returning 12613 1727096140.00873: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0afff68d-5257-a9dd-d073-0000000001ee] 12613 1727096140.00875: sending task result for task 0afff68d-5257-a9dd-d073-0000000001ee skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12613 1727096140.01116: no more pending results, returning what we have 12613 1727096140.01119: results queue empty 12613 1727096140.01120: checking for any_errors_fatal 12613 1727096140.01125: done checking for any_errors_fatal 12613 1727096140.01126: checking for max_fail_percentage 12613 1727096140.01127: done checking for max_fail_percentage 12613 1727096140.01128: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.01129: done checking to see if all hosts have failed 12613 1727096140.01130: getting the remaining hosts for this loop 12613 1727096140.01131: done getting the remaining hosts for this loop 12613 1727096140.01134: getting the next task for host managed_node1 12613 1727096140.01143: done getting next task for host managed_node1 12613 1727096140.01145: ^ task is: TASK: Enable EPEL 7 12613 1727096140.01149: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096140.01152: getting variables 12613 1727096140.01154: in VariableManager get_vars() 12613 1727096140.01186: Calling all_inventory to load vars for managed_node1 12613 1727096140.01189: Calling groups_inventory to load vars for managed_node1 12613 1727096140.01192: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.01205: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.01209: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.01212: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.01585: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001ee 12613 1727096140.01588: WORKER PROCESS EXITING 12613 1727096140.01614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.01819: done with get_vars() 12613 1727096140.01830: done getting variables 12613 1727096140.01887: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 08:55:40 -0400 (0:00:00.026) 0:00:03.656 ****** 12613 1727096140.01922: entering _queue_task() for managed_node1/command 12613 1727096140.02182: worker is 1 (out of 1 available) 12613 1727096140.02196: exiting _queue_task() for managed_node1/command 12613 1727096140.02208: done queuing things up, now waiting for results queue to drain 12613 1727096140.02209: waiting for pending results... 
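"Enable EPEL 7", queued above, and "Enable EPEL 8", which follows at enable_epel.yml:37, are both command tasks behind the same ['7', '8'] version guard, so neither runs on this CentOS 10 host. Their approximate shape, with the actual command left as a placeholder because it never appears in this trace:

  - name: Enable EPEL 7
    command: echo "enable the EPEL 7 repository here"   # placeholder, real command not shown in this log
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']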
12613 1727096140.02410: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 12613 1727096140.02511: in run() - task 0afff68d-5257-a9dd-d073-0000000001ef 12613 1727096140.02524: variable 'ansible_search_path' from source: unknown 12613 1727096140.02528: variable 'ansible_search_path' from source: unknown 12613 1727096140.02563: calling self._execute() 12613 1727096140.02635: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.02639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.02651: variable 'omit' from source: magic vars 12613 1727096140.03030: variable 'ansible_distribution' from source: facts 12613 1727096140.03047: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12613 1727096140.03186: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.03197: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12613 1727096140.03204: when evaluation is False, skipping this task 12613 1727096140.03210: _execute() done 12613 1727096140.03217: dumping result to json 12613 1727096140.03224: done dumping result, returning 12613 1727096140.03242: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0afff68d-5257-a9dd-d073-0000000001ef] 12613 1727096140.03255: sending task result for task 0afff68d-5257-a9dd-d073-0000000001ef skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12613 1727096140.03515: no more pending results, returning what we have 12613 1727096140.03519: results queue empty 12613 1727096140.03519: checking for any_errors_fatal 12613 1727096140.03526: done checking for any_errors_fatal 12613 1727096140.03526: checking for max_fail_percentage 12613 1727096140.03528: done checking for max_fail_percentage 12613 1727096140.03528: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.03529: done checking to see if all hosts have failed 12613 1727096140.03530: getting the remaining hosts for this loop 12613 1727096140.03531: done getting the remaining hosts for this loop 12613 1727096140.03534: getting the next task for host managed_node1 12613 1727096140.03541: done getting next task for host managed_node1 12613 1727096140.03543: ^ task is: TASK: Enable EPEL 8 12613 1727096140.03547: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096140.03551: getting variables 12613 1727096140.03553: in VariableManager get_vars() 12613 1727096140.03579: Calling all_inventory to load vars for managed_node1 12613 1727096140.03582: Calling groups_inventory to load vars for managed_node1 12613 1727096140.03584: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.03590: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001ef 12613 1727096140.03594: WORKER PROCESS EXITING 12613 1727096140.03654: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.03657: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.03661: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.03830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.04029: done with get_vars() 12613 1727096140.04040: done getting variables 12613 1727096140.04111: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 08:55:40 -0400 (0:00:00.022) 0:00:03.678 ****** 12613 1727096140.04142: entering _queue_task() for managed_node1/command 12613 1727096140.04588: worker is 1 (out of 1 available) 12613 1727096140.04600: exiting _queue_task() for managed_node1/command 12613 1727096140.04612: done queuing things up, now waiting for results queue to drain 12613 1727096140.04613: waiting for pending results... 
12613 1727096140.04808: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 12613 1727096140.04944: in run() - task 0afff68d-5257-a9dd-d073-0000000001f0 12613 1727096140.04984: variable 'ansible_search_path' from source: unknown 12613 1727096140.04988: variable 'ansible_search_path' from source: unknown 12613 1727096140.05093: calling self._execute() 12613 1727096140.05129: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.05140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.05159: variable 'omit' from source: magic vars 12613 1727096140.05661: variable 'ansible_distribution' from source: facts 12613 1727096140.05873: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12613 1727096140.05876: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.05878: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12613 1727096140.05881: when evaluation is False, skipping this task 12613 1727096140.05883: _execute() done 12613 1727096140.05885: dumping result to json 12613 1727096140.05887: done dumping result, returning 12613 1727096140.05890: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0afff68d-5257-a9dd-d073-0000000001f0] 12613 1727096140.05892: sending task result for task 0afff68d-5257-a9dd-d073-0000000001f0 12613 1727096140.05960: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001f0 12613 1727096140.05963: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12613 1727096140.06015: no more pending results, returning what we have 12613 1727096140.06018: results queue empty 12613 1727096140.06019: checking for any_errors_fatal 12613 1727096140.06026: done checking for any_errors_fatal 12613 1727096140.06026: checking for max_fail_percentage 12613 1727096140.06028: done checking for max_fail_percentage 12613 1727096140.06029: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.06030: done checking to see if all hosts have failed 12613 1727096140.06030: getting the remaining hosts for this loop 12613 1727096140.06032: done getting the remaining hosts for this loop 12613 1727096140.06035: getting the next task for host managed_node1 12613 1727096140.06045: done getting next task for host managed_node1 12613 1727096140.06048: ^ task is: TASK: Enable EPEL 6 12613 1727096140.06055: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096140.06058: getting variables 12613 1727096140.06060: in VariableManager get_vars() 12613 1727096140.06095: Calling all_inventory to load vars for managed_node1 12613 1727096140.06098: Calling groups_inventory to load vars for managed_node1 12613 1727096140.06101: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.06114: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.06117: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.06119: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.06512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.06764: done with get_vars() 12613 1727096140.06809: done getting variables 12613 1727096140.06872: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 08:55:40 -0400 (0:00:00.027) 0:00:03.706 ****** 12613 1727096140.06902: entering _queue_task() for managed_node1/copy 12613 1727096140.07175: worker is 1 (out of 1 available) 12613 1727096140.07186: exiting _queue_task() for managed_node1/copy 12613 1727096140.07198: done queuing things up, now waiting for results queue to drain 12613 1727096140.07199: waiting for pending results... 12613 1727096140.07545: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 12613 1727096140.07550: in run() - task 0afff68d-5257-a9dd-d073-0000000001f2 12613 1727096140.07552: variable 'ansible_search_path' from source: unknown 12613 1727096140.07556: variable 'ansible_search_path' from source: unknown 12613 1727096140.07592: calling self._execute() 12613 1727096140.07672: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.07685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.07700: variable 'omit' from source: magic vars 12613 1727096140.08072: variable 'ansible_distribution' from source: facts 12613 1727096140.08089: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12613 1727096140.08206: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.08373: Evaluated conditional (ansible_distribution_major_version == '6'): False 12613 1727096140.08377: when evaluation is False, skipping this task 12613 1727096140.08379: _execute() done 12613 1727096140.08382: dumping result to json 12613 1727096140.08383: done dumping result, returning 12613 1727096140.08386: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0afff68d-5257-a9dd-d073-0000000001f2] 12613 1727096140.08388: sending task result for task 0afff68d-5257-a9dd-d073-0000000001f2 12613 1727096140.08462: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001f2 12613 1727096140.08467: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 12613 1727096140.08514: no more pending results, returning what we have 12613 
1727096140.08517: results queue empty 12613 1727096140.08517: checking for any_errors_fatal 12613 1727096140.08523: done checking for any_errors_fatal 12613 1727096140.08524: checking for max_fail_percentage 12613 1727096140.08525: done checking for max_fail_percentage 12613 1727096140.08526: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.08527: done checking to see if all hosts have failed 12613 1727096140.08527: getting the remaining hosts for this loop 12613 1727096140.08528: done getting the remaining hosts for this loop 12613 1727096140.08531: getting the next task for host managed_node1 12613 1727096140.08541: done getting next task for host managed_node1 12613 1727096140.08544: ^ task is: TASK: Set network provider to 'initscripts' 12613 1727096140.08546: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.08549: getting variables 12613 1727096140.08553: in VariableManager get_vars() 12613 1727096140.08581: Calling all_inventory to load vars for managed_node1 12613 1727096140.08583: Calling groups_inventory to load vars for managed_node1 12613 1727096140.08586: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.08595: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.08597: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.08599: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.08817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.09023: done with get_vars() 12613 1727096140.09034: done getting variables 12613 1727096140.09100: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'initscripts'] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:12 Monday 23 September 2024 08:55:40 -0400 (0:00:00.022) 0:00:03.728 ****** 12613 1727096140.09128: entering _queue_task() for managed_node1/set_fact 12613 1727096140.09593: worker is 1 (out of 1 available) 12613 1727096140.09601: exiting _queue_task() for managed_node1/set_fact 12613 1727096140.09610: done queuing things up, now waiting for results queue to drain 12613 1727096140.09611: waiting for pending results... 
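The "Set network provider to 'initscripts'" task queued above is a set_fact task with no conditional logged against it, so it actually executes; the ok: result a few entries below records network_provider being set to initscripts. The task at tests_bond_removal_initscripts.yml:12 therefore almost certainly amounts to:

  - name: Set network provider to 'initscripts'
    set_fact:
      network_provider: initscripts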
12613 1727096140.09739: running TaskExecutor() for managed_node1/TASK: Set network provider to 'initscripts' 12613 1727096140.09840: in run() - task 0afff68d-5257-a9dd-d073-000000000007 12613 1727096140.09844: variable 'ansible_search_path' from source: unknown 12613 1727096140.09884: calling self._execute() 12613 1727096140.10061: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.10065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.10069: variable 'omit' from source: magic vars 12613 1727096140.10128: variable 'omit' from source: magic vars 12613 1727096140.10203: variable 'omit' from source: magic vars 12613 1727096140.10245: variable 'omit' from source: magic vars 12613 1727096140.10304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12613 1727096140.10374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12613 1727096140.10410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12613 1727096140.10440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096140.10475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12613 1727096140.10518: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12613 1727096140.10528: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.10535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.10648: Set connection var ansible_connection to ssh 12613 1727096140.10664: Set connection var ansible_module_compression to ZIP_DEFLATED 12613 1727096140.10681: Set connection var ansible_timeout to 10 12613 1727096140.10693: Set connection var ansible_shell_type to sh 12613 1727096140.10708: Set connection var ansible_pipelining to False 12613 1727096140.10818: Set connection var ansible_shell_executable to /bin/sh 12613 1727096140.10821: variable 'ansible_shell_executable' from source: unknown 12613 1727096140.10823: variable 'ansible_connection' from source: unknown 12613 1727096140.10825: variable 'ansible_module_compression' from source: unknown 12613 1727096140.10827: variable 'ansible_shell_type' from source: unknown 12613 1727096140.10830: variable 'ansible_shell_executable' from source: unknown 12613 1727096140.10832: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.10834: variable 'ansible_pipelining' from source: unknown 12613 1727096140.10835: variable 'ansible_timeout' from source: unknown 12613 1727096140.10837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.10969: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 12613 1727096140.10987: variable 'omit' from source: magic vars 12613 1727096140.11021: starting attempt loop 12613 1727096140.11046: running the handler 12613 1727096140.11143: handler run complete 12613 1727096140.11257: attempt loop complete, returning result 12613 1727096140.11260: _execute() done 12613 
1727096140.11263: dumping result to json 12613 1727096140.11265: done dumping result, returning 12613 1727096140.11268: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'initscripts' [0afff68d-5257-a9dd-d073-000000000007] 12613 1727096140.11270: sending task result for task 0afff68d-5257-a9dd-d073-000000000007 12613 1727096140.11343: done sending task result for task 0afff68d-5257-a9dd-d073-000000000007 12613 1727096140.11346: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "initscripts" }, "changed": false } 12613 1727096140.11407: no more pending results, returning what we have 12613 1727096140.11410: results queue empty 12613 1727096140.11411: checking for any_errors_fatal 12613 1727096140.11416: done checking for any_errors_fatal 12613 1727096140.11416: checking for max_fail_percentage 12613 1727096140.11418: done checking for max_fail_percentage 12613 1727096140.11419: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.11419: done checking to see if all hosts have failed 12613 1727096140.11420: getting the remaining hosts for this loop 12613 1727096140.11421: done getting the remaining hosts for this loop 12613 1727096140.11425: getting the next task for host managed_node1 12613 1727096140.11433: done getting next task for host managed_node1 12613 1727096140.11435: ^ task is: TASK: meta (flush_handlers) 12613 1727096140.11437: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.11441: getting variables 12613 1727096140.11443: in VariableManager get_vars() 12613 1727096140.11478: Calling all_inventory to load vars for managed_node1 12613 1727096140.11480: Calling groups_inventory to load vars for managed_node1 12613 1727096140.11483: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.11494: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.11497: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.11499: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.11907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.12146: done with get_vars() 12613 1727096140.12159: done getting variables 12613 1727096140.12232: in VariableManager get_vars() 12613 1727096140.12242: Calling all_inventory to load vars for managed_node1 12613 1727096140.12245: Calling groups_inventory to load vars for managed_node1 12613 1727096140.12247: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.12254: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.12256: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.12259: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.12403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.12630: done with get_vars() 12613 1727096140.12645: done queuing things up, now waiting for results queue to drain 12613 1727096140.12647: results queue empty 12613 1727096140.12647: checking for any_errors_fatal 12613 1727096140.12650: done checking for any_errors_fatal 12613 
1727096140.12657: checking for max_fail_percentage 12613 1727096140.12659: done checking for max_fail_percentage 12613 1727096140.12659: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.12660: done checking to see if all hosts have failed 12613 1727096140.12661: getting the remaining hosts for this loop 12613 1727096140.12662: done getting the remaining hosts for this loop 12613 1727096140.12665: getting the next task for host managed_node1 12613 1727096140.12670: done getting next task for host managed_node1 12613 1727096140.12672: ^ task is: TASK: meta (flush_handlers) 12613 1727096140.12673: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.12682: getting variables 12613 1727096140.12683: in VariableManager get_vars() 12613 1727096140.12690: Calling all_inventory to load vars for managed_node1 12613 1727096140.12692: Calling groups_inventory to load vars for managed_node1 12613 1727096140.12695: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.12699: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.12702: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.12704: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.12847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.13043: done with get_vars() 12613 1727096140.13053: done getting variables 12613 1727096140.13107: in VariableManager get_vars() 12613 1727096140.13116: Calling all_inventory to load vars for managed_node1 12613 1727096140.13118: Calling groups_inventory to load vars for managed_node1 12613 1727096140.13120: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.13124: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.13126: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.13129: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.13276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.13689: done with get_vars() 12613 1727096140.13701: done queuing things up, now waiting for results queue to drain 12613 1727096140.13702: results queue empty 12613 1727096140.13703: checking for any_errors_fatal 12613 1727096140.13704: done checking for any_errors_fatal 12613 1727096140.13705: checking for max_fail_percentage 12613 1727096140.13706: done checking for max_fail_percentage 12613 1727096140.13707: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.13707: done checking to see if all hosts have failed 12613 1727096140.13708: getting the remaining hosts for this loop 12613 1727096140.13709: done getting the remaining hosts for this loop 12613 1727096140.13711: getting the next task for host managed_node1 12613 1727096140.13714: done getting next task for host managed_node1 12613 1727096140.13715: ^ task is: None 12613 1727096140.13717: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.13718: done queuing things up, now waiting for results queue to drain 12613 1727096140.13718: results queue empty 12613 1727096140.13719: checking for any_errors_fatal 12613 1727096140.13720: done checking for any_errors_fatal 12613 1727096140.13720: checking for max_fail_percentage 12613 1727096140.13721: done checking for max_fail_percentage 12613 1727096140.13722: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.13723: done checking to see if all hosts have failed 12613 1727096140.13724: getting the next task for host managed_node1 12613 1727096140.13726: done getting next task for host managed_node1 12613 1727096140.13727: ^ task is: None 12613 1727096140.13728: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.13779: in VariableManager get_vars() 12613 1727096140.13812: done with get_vars() 12613 1727096140.13818: in VariableManager get_vars() 12613 1727096140.13838: done with get_vars() 12613 1727096140.13857: variable 'omit' from source: magic vars 12613 1727096140.13891: in VariableManager get_vars() 12613 1727096140.13914: done with get_vars() 12613 1727096140.13936: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 12613 1727096140.14896: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12613 1727096140.14921: getting the remaining hosts for this loop 12613 1727096140.14922: done getting the remaining hosts for this loop 12613 1727096140.14925: getting the next task for host managed_node1 12613 1727096140.14927: done getting next task for host managed_node1 12613 1727096140.14933: ^ task is: TASK: Gathering Facts 12613 1727096140.14935: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096140.14936: getting variables 12613 1727096140.14937: in VariableManager get_vars() 12613 1727096140.14955: Calling all_inventory to load vars for managed_node1 12613 1727096140.14957: Calling groups_inventory to load vars for managed_node1 12613 1727096140.14959: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.14964: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.14977: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.14980: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.15103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.15299: done with get_vars() 12613 1727096140.15307: done getting variables 12613 1727096140.15350: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Monday 23 September 2024 08:55:40 -0400 (0:00:00.062) 0:00:03.791 ****** 12613 1727096140.15387: entering _queue_task() for managed_node1/gather_facts 12613 1727096140.15770: worker is 1 (out of 1 available) 12613 1727096140.15783: exiting _queue_task() for managed_node1/gather_facts 12613 1727096140.15795: done queuing things up, now waiting for results queue to drain 12613 1727096140.15796: waiting for pending results... 
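A new play ("Play for testing bond removal") starts here with an explicit Gathering Facts step at playbooks/tests_bond_removal.yml:3. Only the play name and the target host can be read from this trace, so the skeleton below is an assumption beyond those two points; the host pattern in the real playbook may well differ, managed_node1 is simply the only host in this run:

  - name: Play for testing bond removal
    hosts: managed_node1           # assumed pattern; only this host appears in the run
    # gather_facts settings, vars and the task list are omitted; see the task banners that follow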
12613 1727096140.16060: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12613 1727096140.16234: in run() - task 0afff68d-5257-a9dd-d073-000000000218 12613 1727096140.16238: variable 'ansible_search_path' from source: unknown 12613 1727096140.16242: calling self._execute() 12613 1727096140.16338: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.16350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.16377: variable 'omit' from source: magic vars 12613 1727096140.17104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.19509: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.19593: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.19635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.19680: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.19716: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.19804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.19844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.19878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.19929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.19949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.20088: variable 'ansible_distribution' from source: facts 12613 1727096140.20103: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.20123: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.20130: when evaluation is False, skipping this task 12613 1727096140.20135: _execute() done 12613 1727096140.20141: dumping result to json 12613 1727096140.20147: done dumping result, returning 12613 1727096140.20160: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-a9dd-d073-000000000218] 12613 1727096140.20172: sending task result for task 0afff68d-5257-a9dd-d073-000000000218 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.20357: no more pending results, returning what we have 12613 1727096140.20361: results queue empty 12613 
1727096140.20362: checking for any_errors_fatal 12613 1727096140.20363: done checking for any_errors_fatal 12613 1727096140.20363: checking for max_fail_percentage 12613 1727096140.20365: done checking for max_fail_percentage 12613 1727096140.20366: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.20367: done checking to see if all hosts have failed 12613 1727096140.20369: getting the remaining hosts for this loop 12613 1727096140.20371: done getting the remaining hosts for this loop 12613 1727096140.20375: getting the next task for host managed_node1 12613 1727096140.20382: done getting next task for host managed_node1 12613 1727096140.20384: ^ task is: TASK: meta (flush_handlers) 12613 1727096140.20385: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.20389: getting variables 12613 1727096140.20391: in VariableManager get_vars() 12613 1727096140.20450: Calling all_inventory to load vars for managed_node1 12613 1727096140.20455: Calling groups_inventory to load vars for managed_node1 12613 1727096140.20458: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.20682: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.20686: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.20690: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.20980: done sending task result for task 0afff68d-5257-a9dd-d073-000000000218 12613 1727096140.20984: WORKER PROCESS EXITING 12613 1727096140.21009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.21214: done with get_vars() 12613 1727096140.21224: done getting variables 12613 1727096140.21285: in VariableManager get_vars() 12613 1727096140.21302: Calling all_inventory to load vars for managed_node1 12613 1727096140.21304: Calling groups_inventory to load vars for managed_node1 12613 1727096140.21305: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.21309: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.21316: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.21318: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.21443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.21650: done with get_vars() 12613 1727096140.21669: done queuing things up, now waiting for results queue to drain 12613 1727096140.21671: results queue empty 12613 1727096140.21672: checking for any_errors_fatal 12613 1727096140.21674: done checking for any_errors_fatal 12613 1727096140.21675: checking for max_fail_percentage 12613 1727096140.21676: done checking for max_fail_percentage 12613 1727096140.21676: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.21677: done checking to see if all hosts have failed 12613 1727096140.21678: getting the remaining hosts for this loop 12613 1727096140.21679: done getting the remaining hosts for this loop 12613 1727096140.21682: getting the next task for host managed_node1 12613 1727096140.21686: done getting next task for host managed_node1 12613 
1727096140.21688: ^ task is: TASK: INIT Prepare setup 12613 1727096140.21690: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.21692: getting variables 12613 1727096140.21693: in VariableManager get_vars() 12613 1727096140.21713: Calling all_inventory to load vars for managed_node1 12613 1727096140.21715: Calling groups_inventory to load vars for managed_node1 12613 1727096140.21716: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.21721: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.21729: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.21732: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.21879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.22101: done with get_vars() 12613 1727096140.22109: done getting variables 12613 1727096140.22198: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15 Monday 23 September 2024 08:55:40 -0400 (0:00:00.068) 0:00:03.859 ****** 12613 1727096140.22226: entering _queue_task() for managed_node1/debug 12613 1727096140.22228: Creating lock for debug 12613 1727096140.22653: worker is 1 (out of 1 available) 12613 1727096140.22665: exiting _queue_task() for managed_node1/debug 12613 1727096140.22678: done queuing things up, now waiting for results queue to drain 12613 1727096140.22680: waiting for pending results... 
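The "INIT Prepare setup" task queued above is a debug task (the debug action module is loaded and locked just before the banner); it is skipped below by the same CentOS/RedHat with major version < 9 condition that skipped the fact gathering. Only its rough shape can be given, and the message text is a placeholder:

  - name: INIT Prepare setup
    debug:
      msg: "Prepare the test setup"   # placeholder text, not visible in this log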
12613 1727096140.22886: running TaskExecutor() for managed_node1/TASK: INIT Prepare setup 12613 1727096140.23003: in run() - task 0afff68d-5257-a9dd-d073-00000000000b 12613 1727096140.23023: variable 'ansible_search_path' from source: unknown 12613 1727096140.23076: calling self._execute() 12613 1727096140.23173: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.23186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.23201: variable 'omit' from source: magic vars 12613 1727096140.23678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.26013: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.26103: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.26144: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.26188: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.26223: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.26308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.26372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.26384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.26433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.26454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.26596: variable 'ansible_distribution' from source: facts 12613 1727096140.26606: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.26628: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.26636: when evaluation is False, skipping this task 12613 1727096140.26650: _execute() done 12613 1727096140.26659: dumping result to json 12613 1727096140.26757: done dumping result, returning 12613 1727096140.26760: done running TaskExecutor() for managed_node1/TASK: INIT Prepare setup [0afff68d-5257-a9dd-d073-00000000000b] 12613 1727096140.26762: sending task result for task 0afff68d-5257-a9dd-d073-00000000000b 12613 1727096140.26833: done sending task result for task 0afff68d-5257-a9dd-d073-00000000000b 12613 1727096140.26835: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096140.26905: no more pending 
results, returning what we have 12613 1727096140.26909: results queue empty 12613 1727096140.26909: checking for any_errors_fatal 12613 1727096140.26911: done checking for any_errors_fatal 12613 1727096140.26912: checking for max_fail_percentage 12613 1727096140.26913: done checking for max_fail_percentage 12613 1727096140.26914: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.26915: done checking to see if all hosts have failed 12613 1727096140.26916: getting the remaining hosts for this loop 12613 1727096140.26917: done getting the remaining hosts for this loop 12613 1727096140.26921: getting the next task for host managed_node1 12613 1727096140.26927: done getting next task for host managed_node1 12613 1727096140.26931: ^ task is: TASK: Install dnsmasq 12613 1727096140.26934: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.26938: getting variables 12613 1727096140.26940: in VariableManager get_vars() 12613 1727096140.27005: Calling all_inventory to load vars for managed_node1 12613 1727096140.27008: Calling groups_inventory to load vars for managed_node1 12613 1727096140.27010: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.27022: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.27024: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.27028: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.27327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.27662: done with get_vars() 12613 1727096140.27674: done getting variables 12613 1727096140.27729: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:55:40 -0400 (0:00:00.055) 0:00:03.914 ****** 12613 1727096140.27758: entering _queue_task() for managed_node1/package 12613 1727096140.28055: worker is 1 (out of 1 available) 12613 1727096140.28070: exiting _queue_task() for managed_node1/package 12613 1727096140.28081: done queuing things up, now waiting for results queue to drain 12613 1727096140.28082: waiting for pending results... 
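
Every task in this stretch of the run is skipped the same way: its when: guard, (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9), evaluates to False on managed_node1, so _execute() returns a skip result instead of dispatching the module. The snippet below is a minimal standalone sketch of that evaluation using plain jinja2; Ansible actually routes this through its own Templar and fact cache, and the fact values shown are invented for illustration, not the real values gathered from managed_node1.

import jinja2

# The expression exactly as it appears in the "Evaluated conditional" log lines.
CONDITION = ("(ansible_distribution in ['CentOS','RedHat'] "
             "and ansible_distribution_major_version | int < 9)")

# Assumed sample facts; the real values come from fact gathering on the host.
facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "10",
}

env = jinja2.Environment()
result = env.from_string("{{ %s }}" % CONDITION).render(**facts)
# Prints "False": the distribution matches, but 10 is not < 9, so the task is skipped.
print(result)
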
12613 1727096140.28363: running TaskExecutor() for managed_node1/TASK: Install dnsmasq 12613 1727096140.28436: in run() - task 0afff68d-5257-a9dd-d073-00000000000f 12613 1727096140.28469: variable 'ansible_search_path' from source: unknown 12613 1727096140.28478: variable 'ansible_search_path' from source: unknown 12613 1727096140.28519: calling self._execute() 12613 1727096140.28677: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.28680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.28683: variable 'omit' from source: magic vars 12613 1727096140.29118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.31449: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.31538: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.31591: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.31641: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.31677: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.31766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.31834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.31840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.31892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.31913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.32076: variable 'ansible_distribution' from source: facts 12613 1727096140.32161: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.32164: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.32168: when evaluation is False, skipping this task 12613 1727096140.32171: _execute() done 12613 1727096140.32173: dumping result to json 12613 1727096140.32175: done dumping result, returning 12613 1727096140.32177: done running TaskExecutor() for managed_node1/TASK: Install dnsmasq [0afff68d-5257-a9dd-d073-00000000000f] 12613 1727096140.32179: sending task result for task 0afff68d-5257-a9dd-d073-00000000000f skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.32430: no more pending results, 
returning what we have 12613 1727096140.32434: results queue empty 12613 1727096140.32435: checking for any_errors_fatal 12613 1727096140.32442: done checking for any_errors_fatal 12613 1727096140.32442: checking for max_fail_percentage 12613 1727096140.32445: done checking for max_fail_percentage 12613 1727096140.32445: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.32446: done checking to see if all hosts have failed 12613 1727096140.32447: getting the remaining hosts for this loop 12613 1727096140.32449: done getting the remaining hosts for this loop 12613 1727096140.32456: getting the next task for host managed_node1 12613 1727096140.32463: done getting next task for host managed_node1 12613 1727096140.32466: ^ task is: TASK: Install pgrep, sysctl 12613 1727096140.32471: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.32474: getting variables 12613 1727096140.32477: in VariableManager get_vars() 12613 1727096140.32537: Calling all_inventory to load vars for managed_node1 12613 1727096140.32540: Calling groups_inventory to load vars for managed_node1 12613 1727096140.32543: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.32557: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.32560: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.32564: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.32691: done sending task result for task 0afff68d-5257-a9dd-d073-00000000000f 12613 1727096140.32694: WORKER PROCESS EXITING 12613 1727096140.33057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.33245: done with get_vars() 12613 1727096140.33259: done getting variables 12613 1727096140.33323: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Monday 23 September 2024 08:55:40 -0400 (0:00:00.055) 0:00:03.970 ****** 12613 1727096140.33355: entering _queue_task() for managed_node1/package 12613 1727096140.33696: worker is 1 (out of 1 available) 12613 1727096140.33710: exiting _queue_task() for managed_node1/package 12613 1727096140.33722: done queuing things up, now waiting for results queue to drain 12613 1727096140.33725: waiting for pending results... 
12613 1727096140.33984: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 12613 1727096140.34101: in run() - task 0afff68d-5257-a9dd-d073-000000000010 12613 1727096140.34120: variable 'ansible_search_path' from source: unknown 12613 1727096140.34127: variable 'ansible_search_path' from source: unknown 12613 1727096140.34172: calling self._execute() 12613 1727096140.34274: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.34277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.34285: variable 'omit' from source: magic vars 12613 1727096140.34711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.36995: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.37054: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.37173: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.37177: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.37180: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.37255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.37293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.37328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.37373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.37392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.37539: variable 'ansible_distribution' from source: facts 12613 1727096140.37550: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.37574: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.37582: when evaluation is False, skipping this task 12613 1727096140.37588: _execute() done 12613 1727096140.37593: dumping result to json 12613 1727096140.37600: done dumping result, returning 12613 1727096140.37609: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0afff68d-5257-a9dd-d073-000000000010] 12613 1727096140.37617: sending task result for task 0afff68d-5257-a9dd-d073-000000000010 12613 1727096140.37828: done sending task result for task 0afff68d-5257-a9dd-d073-000000000010 12613 1727096140.37833: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.37883: no more pending results, returning what we have 12613 1727096140.37886: results queue empty 12613 1727096140.37887: checking for any_errors_fatal 12613 1727096140.37891: done checking for any_errors_fatal 12613 1727096140.37892: checking for max_fail_percentage 12613 1727096140.37894: done checking for max_fail_percentage 12613 1727096140.37894: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.37895: done checking to see if all hosts have failed 12613 1727096140.37896: getting the remaining hosts for this loop 12613 1727096140.37897: done getting the remaining hosts for this loop 12613 1727096140.37902: getting the next task for host managed_node1 12613 1727096140.37908: done getting next task for host managed_node1 12613 1727096140.37910: ^ task is: TASK: Install pgrep, sysctl 12613 1727096140.37913: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.37916: getting variables 12613 1727096140.37918: in VariableManager get_vars() 12613 1727096140.37971: Calling all_inventory to load vars for managed_node1 12613 1727096140.37974: Calling groups_inventory to load vars for managed_node1 12613 1727096140.37976: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.37985: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.37988: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.37990: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.38165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.38388: done with get_vars() 12613 1727096140.38400: done getting variables 12613 1727096140.38458: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Monday 23 September 2024 08:55:40 -0400 (0:00:00.051) 0:00:04.022 ****** 12613 1727096140.38489: entering _queue_task() for managed_node1/package 12613 1727096140.38842: worker is 1 (out of 1 available) 12613 1727096140.38857: exiting _queue_task() for managed_node1/package 12613 1727096140.38927: done queuing things up, now waiting for results queue to drain 12613 1727096140.38929: waiting for pending results... 
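
Both facts the guard reads, ansible_distribution and ansible_distribution_major_version, come "from source: facts", i.e. they were gathered from the managed host earlier in the play. As a rough way to eyeball the underlying data on an os-release based system, the sketch below parses /etc/os-release directly; this is only an approximation, since Ansible's distribution detection consults several platform-specific sources before settling on those fact values.

# Approximation only: NAME loosely corresponds to ansible_distribution and
# VERSION_ID to the major version; this is not Ansible's actual fact code.
def read_os_release(path="/etc/os-release"):
    info = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            info[key] = value.strip('"')
    return info


if __name__ == "__main__":
    osr = read_os_release()
    print(osr.get("NAME"), osr.get("VERSION_ID"))
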
12613 1727096140.39162: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 12613 1727096140.39256: in run() - task 0afff68d-5257-a9dd-d073-000000000011 12613 1727096140.39260: variable 'ansible_search_path' from source: unknown 12613 1727096140.39273: variable 'ansible_search_path' from source: unknown 12613 1727096140.39363: calling self._execute() 12613 1727096140.39407: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.39416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.39429: variable 'omit' from source: magic vars 12613 1727096140.39991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.42339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.42425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.42484: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.42615: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.42619: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.42663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.42700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.42736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.42784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.42804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.42960: variable 'ansible_distribution' from source: facts 12613 1727096140.42974: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.42996: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.43003: when evaluation is False, skipping this task 12613 1727096140.43009: _execute() done 12613 1727096140.43016: dumping result to json 12613 1727096140.43023: done dumping result, returning 12613 1727096140.43034: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0afff68d-5257-a9dd-d073-000000000011] 12613 1727096140.43047: sending task result for task 0afff68d-5257-a9dd-d073-000000000011 12613 1727096140.43235: done sending task result for task 0afff68d-5257-a9dd-d073-000000000011 12613 1727096140.43238: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.43311: no more pending results, returning what we have 12613 1727096140.43314: results queue empty 12613 1727096140.43316: checking for any_errors_fatal 12613 1727096140.43323: done checking for any_errors_fatal 12613 1727096140.43324: checking for max_fail_percentage 12613 1727096140.43326: done checking for max_fail_percentage 12613 1727096140.43327: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.43328: done checking to see if all hosts have failed 12613 1727096140.43328: getting the remaining hosts for this loop 12613 1727096140.43329: done getting the remaining hosts for this loop 12613 1727096140.43333: getting the next task for host managed_node1 12613 1727096140.43340: done getting next task for host managed_node1 12613 1727096140.43343: ^ task is: TASK: Create test interfaces 12613 1727096140.43345: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.43349: getting variables 12613 1727096140.43353: in VariableManager get_vars() 12613 1727096140.43607: Calling all_inventory to load vars for managed_node1 12613 1727096140.43610: Calling groups_inventory to load vars for managed_node1 12613 1727096140.43612: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.43622: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.43624: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.43627: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.43905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.44120: done with get_vars() 12613 1727096140.44132: done getting variables 12613 1727096140.44232: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Monday 23 September 2024 08:55:40 -0400 (0:00:00.057) 0:00:04.079 ****** 12613 1727096140.44265: entering _queue_task() for managed_node1/shell 12613 1727096140.44269: Creating lock for shell 12613 1727096140.44615: worker is 1 (out of 1 available) 12613 1727096140.44629: exiting _queue_task() for managed_node1/shell 12613 1727096140.44647: done queuing things up, now waiting for results queue to drain 12613 1727096140.44648: waiting for pending results... 
12613 1727096140.44890: running TaskExecutor() for managed_node1/TASK: Create test interfaces 12613 1727096140.45009: in run() - task 0afff68d-5257-a9dd-d073-000000000012 12613 1727096140.45063: variable 'ansible_search_path' from source: unknown 12613 1727096140.45068: variable 'ansible_search_path' from source: unknown 12613 1727096140.45086: calling self._execute() 12613 1727096140.45183: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.45195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.45272: variable 'omit' from source: magic vars 12613 1727096140.45664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.47982: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.48060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.48273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.48277: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.48280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.48282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.48293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.48322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.48365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.48386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.48531: variable 'ansible_distribution' from source: facts 12613 1727096140.48542: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.48564: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.48573: when evaluation is False, skipping this task 12613 1727096140.48579: _execute() done 12613 1727096140.48585: dumping result to json 12613 1727096140.48592: done dumping result, returning 12613 1727096140.48602: done running TaskExecutor() for managed_node1/TASK: Create test interfaces [0afff68d-5257-a9dd-d073-000000000012] 12613 1727096140.48615: sending task result for task 0afff68d-5257-a9dd-d073-000000000012 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.48763: no more 
pending results, returning what we have 12613 1727096140.48767: results queue empty 12613 1727096140.48772: checking for any_errors_fatal 12613 1727096140.48778: done checking for any_errors_fatal 12613 1727096140.48778: checking for max_fail_percentage 12613 1727096140.48780: done checking for max_fail_percentage 12613 1727096140.48781: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.48781: done checking to see if all hosts have failed 12613 1727096140.48782: getting the remaining hosts for this loop 12613 1727096140.48784: done getting the remaining hosts for this loop 12613 1727096140.48787: getting the next task for host managed_node1 12613 1727096140.48796: done getting next task for host managed_node1 12613 1727096140.48799: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12613 1727096140.48802: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.48805: getting variables 12613 1727096140.48807: in VariableManager get_vars() 12613 1727096140.48860: Calling all_inventory to load vars for managed_node1 12613 1727096140.48863: Calling groups_inventory to load vars for managed_node1 12613 1727096140.48865: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.49055: done sending task result for task 0afff68d-5257-a9dd-d073-000000000012 12613 1727096140.49058: WORKER PROCESS EXITING 12613 1727096140.49070: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.49074: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.49077: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.49257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.49472: done with get_vars() 12613 1727096140.49488: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:40 -0400 (0:00:00.053) 0:00:04.133 ****** 12613 1727096140.49582: entering _queue_task() for managed_node1/include_tasks 12613 1727096140.49950: worker is 1 (out of 1 available) 12613 1727096140.49964: exiting _queue_task() for managed_node1/include_tasks 12613 1727096140.50032: done queuing things up, now waiting for results queue to drain 12613 1727096140.50034: waiting for pending results... 
12613 1727096140.50242: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 12613 1727096140.50398: in run() - task 0afff68d-5257-a9dd-d073-000000000016 12613 1727096140.50461: variable 'ansible_search_path' from source: unknown 12613 1727096140.50464: variable 'ansible_search_path' from source: unknown 12613 1727096140.50470: calling self._execute() 12613 1727096140.50601: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.50612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.50629: variable 'omit' from source: magic vars 12613 1727096140.51242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.53882: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.53922: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.53964: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.54013: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.54044: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.54137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.54174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.54308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.54312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.54314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.54419: variable 'ansible_distribution' from source: facts 12613 1727096140.54434: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.54456: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.54463: when evaluation is False, skipping this task 12613 1727096140.54472: _execute() done 12613 1727096140.54478: dumping result to json 12613 1727096140.54485: done dumping result, returning 12613 1727096140.54496: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-a9dd-d073-000000000016] 12613 1727096140.54523: sending task result for task 0afff68d-5257-a9dd-d073-000000000016 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 
12613 1727096140.54675: no more pending results, returning what we have 12613 1727096140.54679: results queue empty 12613 1727096140.54681: checking for any_errors_fatal 12613 1727096140.54688: done checking for any_errors_fatal 12613 1727096140.54688: checking for max_fail_percentage 12613 1727096140.54690: done checking for max_fail_percentage 12613 1727096140.54691: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.54691: done checking to see if all hosts have failed 12613 1727096140.54692: getting the remaining hosts for this loop 12613 1727096140.54694: done getting the remaining hosts for this loop 12613 1727096140.54698: getting the next task for host managed_node1 12613 1727096140.54705: done getting next task for host managed_node1 12613 1727096140.54708: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12613 1727096140.54711: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.54714: getting variables 12613 1727096140.54716: in VariableManager get_vars() 12613 1727096140.54772: Calling all_inventory to load vars for managed_node1 12613 1727096140.54775: Calling groups_inventory to load vars for managed_node1 12613 1727096140.54832: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.54844: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.54847: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.54850: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.55327: done sending task result for task 0afff68d-5257-a9dd-d073-000000000016 12613 1727096140.55331: WORKER PROCESS EXITING 12613 1727096140.55355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.55570: done with get_vars() 12613 1727096140.55582: done getting variables 12613 1727096140.55687: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 12613 1727096140.55818: variable 'interface' from source: task vars 12613 1727096140.55824: variable 'dhcp_interface1' from source: play vars 12613 1727096140.55891: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:40 -0400 (0:00:00.063) 0:00:04.196 ****** 12613 1727096140.55932: entering _queue_task() for managed_node1/assert 12613 1727096140.55933: Creating lock for assert 12613 1727096140.56265: worker is 1 (out of 1 available) 12613 1727096140.56401: exiting _queue_task() for managed_node1/assert 12613 1727096140.56412: 
done queuing things up, now waiting for results queue to drain 12613 1727096140.56414: waiting for pending results... 12613 1727096140.56615: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' 12613 1727096140.56845: in run() - task 0afff68d-5257-a9dd-d073-000000000017 12613 1727096140.56849: variable 'ansible_search_path' from source: unknown 12613 1727096140.56852: variable 'ansible_search_path' from source: unknown 12613 1727096140.56877: calling self._execute() 12613 1727096140.57062: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.57066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.57070: variable 'omit' from source: magic vars 12613 1727096140.57525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.59814: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.59863: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.59895: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.59922: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.59942: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.60007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.60040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.60058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.60086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.60101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.60198: variable 'ansible_distribution' from source: facts 12613 1727096140.60201: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.60219: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.60222: when evaluation is False, skipping this task 12613 1727096140.60227: _execute() done 12613 1727096140.60230: dumping result to json 12613 1727096140.60232: done dumping result, returning 12613 1727096140.60243: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' [0afff68d-5257-a9dd-d073-000000000017] 12613 1727096140.60245: sending task result for task 0afff68d-5257-a9dd-d073-000000000017 12613 1727096140.60329: done sending task result for task 
0afff68d-5257-a9dd-d073-000000000017 12613 1727096140.60332: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.60388: no more pending results, returning what we have 12613 1727096140.60391: results queue empty 12613 1727096140.60392: checking for any_errors_fatal 12613 1727096140.60396: done checking for any_errors_fatal 12613 1727096140.60396: checking for max_fail_percentage 12613 1727096140.60398: done checking for max_fail_percentage 12613 1727096140.60399: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.60399: done checking to see if all hosts have failed 12613 1727096140.60400: getting the remaining hosts for this loop 12613 1727096140.60401: done getting the remaining hosts for this loop 12613 1727096140.60405: getting the next task for host managed_node1 12613 1727096140.60413: done getting next task for host managed_node1 12613 1727096140.60416: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12613 1727096140.60418: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.60422: getting variables 12613 1727096140.60423: in VariableManager get_vars() 12613 1727096140.60478: Calling all_inventory to load vars for managed_node1 12613 1727096140.60481: Calling groups_inventory to load vars for managed_node1 12613 1727096140.60483: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.60494: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.60496: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.60499: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.60642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.60792: done with get_vars() 12613 1727096140.60800: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:40 -0400 (0:00:00.049) 0:00:04.245 ****** 12613 1727096140.60864: entering _queue_task() for managed_node1/include_tasks 12613 1727096140.61145: worker is 1 (out of 1 available) 12613 1727096140.61158: exiting _queue_task() for managed_node1/include_tasks 12613 1727096140.61172: done queuing things up, now waiting for results queue to drain 12613 1727096140.61174: waiting for pending results... 
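
The two assert headings above ('test1' and 'test2') are rendered from the raw task name "Assert that the interface is present - '{{ interface }}'": interface is resolved from task vars, which in turn point at dhcp_interface1 / dhcp_interface2 from play vars, as the "variable ... from source" lines show. The sketch below reproduces that two-step resolution with plain jinja2 and assumed variable wiring; Ansible's Templar performs this recursion itself.

import jinja2

env = jinja2.Environment()

# Assumed wiring, inferred from the log: the include sets interface to a
# reference to a play var, and the play var holds the concrete device name.
play_vars = {"dhcp_interface1": "test1", "dhcp_interface2": "test2"}
task_vars = {"interface": "{{ dhcp_interface1 }}", **play_vars}

raw_name = "Assert that the interface is present - '{{ interface }}'"

once = env.from_string(raw_name).render(**task_vars)   # -> ...'{{ dhcp_interface1 }}'
twice = env.from_string(once).render(**task_vars)      # -> ...'test1'
print(twice)
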
12613 1727096140.61525: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 12613 1727096140.61549: in run() - task 0afff68d-5257-a9dd-d073-00000000001b 12613 1727096140.61576: variable 'ansible_search_path' from source: unknown 12613 1727096140.61584: variable 'ansible_search_path' from source: unknown 12613 1727096140.61634: calling self._execute() 12613 1727096140.61737: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.61749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.61771: variable 'omit' from source: magic vars 12613 1727096140.62331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.63873: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.63926: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.63958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.63984: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.64010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.64070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.64091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.64110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.64137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.64150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.64275: variable 'ansible_distribution' from source: facts 12613 1727096140.64279: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.64372: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.64375: when evaluation is False, skipping this task 12613 1727096140.64378: _execute() done 12613 1727096140.64380: dumping result to json 12613 1727096140.64382: done dumping result, returning 12613 1727096140.64384: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-a9dd-d073-00000000001b] 12613 1727096140.64386: sending task result for task 0afff68d-5257-a9dd-d073-00000000001b 12613 1727096140.64458: done sending task result for task 0afff68d-5257-a9dd-d073-00000000001b 12613 1727096140.64461: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.64508: no more pending results, returning what we have 12613 1727096140.64511: results queue empty 12613 1727096140.64512: checking for any_errors_fatal 12613 1727096140.64521: done checking for any_errors_fatal 12613 1727096140.64522: checking for max_fail_percentage 12613 1727096140.64523: done checking for max_fail_percentage 12613 1727096140.64524: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.64525: done checking to see if all hosts have failed 12613 1727096140.64526: getting the remaining hosts for this loop 12613 1727096140.64527: done getting the remaining hosts for this loop 12613 1727096140.64531: getting the next task for host managed_node1 12613 1727096140.64537: done getting next task for host managed_node1 12613 1727096140.64540: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12613 1727096140.64542: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.64546: getting variables 12613 1727096140.64547: in VariableManager get_vars() 12613 1727096140.64603: Calling all_inventory to load vars for managed_node1 12613 1727096140.64605: Calling groups_inventory to load vars for managed_node1 12613 1727096140.64607: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.64618: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.64620: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.64622: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.64909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.65112: done with get_vars() 12613 1727096140.65125: done getting variables 12613 1727096140.65186: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12613 1727096140.65315: variable 'interface' from source: task vars 12613 1727096140.65319: variable 'dhcp_interface2' from source: play vars 12613 1727096140.65382: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:40 -0400 (0:00:00.045) 0:00:04.291 ****** 12613 1727096140.65414: entering _queue_task() for managed_node1/assert 12613 1727096140.65714: worker is 1 (out of 1 available) 12613 1727096140.65726: exiting _queue_task() for managed_node1/assert 12613 1727096140.65738: done queuing things up, now waiting for results 
queue to drain 12613 1727096140.65739: waiting for pending results... 12613 1727096140.66027: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' 12613 1727096140.66102: in run() - task 0afff68d-5257-a9dd-d073-00000000001c 12613 1727096140.66110: variable 'ansible_search_path' from source: unknown 12613 1727096140.66114: variable 'ansible_search_path' from source: unknown 12613 1727096140.66153: calling self._execute() 12613 1727096140.66219: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.66225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.66236: variable 'omit' from source: magic vars 12613 1727096140.66555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.68379: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.68384: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.68419: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.68449: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.68480: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.68774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.68777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.68780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.68782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.68784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.68830: variable 'ansible_distribution' from source: facts 12613 1727096140.68842: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.68870: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.68879: when evaluation is False, skipping this task 12613 1727096140.68886: _execute() done 12613 1727096140.68893: dumping result to json 12613 1727096140.68902: done dumping result, returning 12613 1727096140.68914: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' [0afff68d-5257-a9dd-d073-00000000001c] 12613 1727096140.68924: sending task result for task 0afff68d-5257-a9dd-d073-00000000001c skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.69088: no more pending results, returning what we have 12613 1727096140.69092: results queue empty 12613 1727096140.69093: checking for any_errors_fatal 12613 1727096140.69098: done checking for any_errors_fatal 12613 1727096140.69098: checking for max_fail_percentage 12613 1727096140.69100: done checking for max_fail_percentage 12613 1727096140.69101: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.69101: done checking to see if all hosts have failed 12613 1727096140.69102: getting the remaining hosts for this loop 12613 1727096140.69103: done getting the remaining hosts for this loop 12613 1727096140.69107: getting the next task for host managed_node1 12613 1727096140.69116: done getting next task for host managed_node1 12613 1727096140.69119: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 12613 1727096140.69121: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.69125: getting variables 12613 1727096140.69127: in VariableManager get_vars() 12613 1727096140.69187: Calling all_inventory to load vars for managed_node1 12613 1727096140.69190: Calling groups_inventory to load vars for managed_node1 12613 1727096140.69192: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.69204: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.69206: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.69209: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.69603: done sending task result for task 0afff68d-5257-a9dd-d073-00000000001c 12613 1727096140.69607: WORKER PROCESS EXITING 12613 1727096140.69621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.69814: done with get_vars() 12613 1727096140.69825: done getting variables 12613 1727096140.69886: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Monday 23 September 2024 08:55:40 -0400 (0:00:00.044) 0:00:04.336 ****** 12613 1727096140.69908: entering _queue_task() for managed_node1/command 12613 1727096140.70131: worker is 1 (out of 1 available) 12613 1727096140.70144: exiting _queue_task() for managed_node1/command 12613 1727096140.70154: done queuing things up, now waiting for results queue to drain 12613 1727096140.70155: waiting for pending results... 
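Every skipped task in this stretch fails the same guard: (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9) evaluates to False against the gathered facts, so the initscripts-specific branch of the test never runs on this host. As a reference for how that reads in playbook form, here is a minimal sketch of a gated assert task, assuming the guard is attached directly with when: (the real tests_bond_removal playbook may attach it at the block or import level instead, and the assertion body shown is illustrative only):

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface in ansible_facts.interfaces    # illustrative check, not the role's actual assertion
      vars:
        interface: "{{ dhcp_interface2 }}"
      when: ansible_distribution in ['CentOS', 'RedHat'] and
            ansible_distribution_major_version | int < 9

When the when: expression is False, Ansible marks the task as skipped and records the expression under "false_condition", which is exactly what the JSON results above show.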
12613 1727096140.70327: running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript 12613 1727096140.70388: in run() - task 0afff68d-5257-a9dd-d073-00000000001d 12613 1727096140.70397: variable 'ansible_search_path' from source: unknown 12613 1727096140.70426: calling self._execute() 12613 1727096140.70495: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.70499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.70509: variable 'omit' from source: magic vars 12613 1727096140.70821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.72692: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.72747: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.72770: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.72796: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.72816: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.72881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.72901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.72930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.72950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.72966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.73073: variable 'ansible_distribution' from source: facts 12613 1727096140.73078: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.73087: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.73090: when evaluation is False, skipping this task 12613 1727096140.73092: _execute() done 12613 1727096140.73095: dumping result to json 12613 1727096140.73099: done dumping result, returning 12613 1727096140.73106: done running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript [0afff68d-5257-a9dd-d073-00000000001d] 12613 1727096140.73111: sending task result for task 0afff68d-5257-a9dd-d073-00000000001d 12613 1727096140.73212: done sending task result for task 0afff68d-5257-a9dd-d073-00000000001d 12613 1727096140.73215: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.73260: no more pending results, returning what we have 12613 1727096140.73263: results queue empty 12613 1727096140.73264: checking for any_errors_fatal 12613 1727096140.73280: done checking for any_errors_fatal 12613 1727096140.73280: checking for max_fail_percentage 12613 1727096140.73287: done checking for max_fail_percentage 12613 1727096140.73288: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.73289: done checking to see if all hosts have failed 12613 1727096140.73290: getting the remaining hosts for this loop 12613 1727096140.73291: done getting the remaining hosts for this loop 12613 1727096140.73295: getting the next task for host managed_node1 12613 1727096140.73301: done getting next task for host managed_node1 12613 1727096140.73304: ^ task is: TASK: TEST Add Bond with 2 ports 12613 1727096140.73306: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.73308: getting variables 12613 1727096140.73311: in VariableManager get_vars() 12613 1727096140.73362: Calling all_inventory to load vars for managed_node1 12613 1727096140.73365: Calling groups_inventory to load vars for managed_node1 12613 1727096140.73370: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.73380: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.73382: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.73388: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.73535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.73665: done with get_vars() 12613 1727096140.73676: done getting variables 12613 1727096140.73723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Monday 23 September 2024 08:55:40 -0400 (0:00:00.038) 0:00:04.374 ****** 12613 1727096140.73744: entering _queue_task() for managed_node1/debug 12613 1727096140.73971: worker is 1 (out of 1 available) 12613 1727096140.73984: exiting _queue_task() for managed_node1/debug 12613 1727096140.73996: done queuing things up, now waiting for results queue to drain 12613 1727096140.73998: waiting for pending results... 
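The skipped "Backup the /etc/resolv.conf for initscript" step is a command action (the command action plugin is loaded just before it is queued). A sketch of the kind of task this suggests, assuming a simple copy-aside; the actual command line and backup path used by tests_bond_removal.yml are not visible in this log:

    - name: Backup the /etc/resolv.conf for initscript
      ansible.builtin.command: cp /etc/resolv.conf /tmp/resolv.conf.bak    # destination path is assumed, for illustration only
      when: ansible_distribution in ['CentOS', 'RedHat'] and
            ansible_distribution_major_version | int < 9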
12613 1727096140.74162: running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports 12613 1727096140.74224: in run() - task 0afff68d-5257-a9dd-d073-00000000001e 12613 1727096140.74240: variable 'ansible_search_path' from source: unknown 12613 1727096140.74270: calling self._execute() 12613 1727096140.74333: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.74339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.74353: variable 'omit' from source: magic vars 12613 1727096140.74678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.76271: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.76319: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.76346: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.76376: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.76395: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.76461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.76483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.76500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.76527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.76544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.76648: variable 'ansible_distribution' from source: facts 12613 1727096140.76656: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.76671: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.76674: when evaluation is False, skipping this task 12613 1727096140.76677: _execute() done 12613 1727096140.76679: dumping result to json 12613 1727096140.76683: done dumping result, returning 12613 1727096140.76690: done running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports [0afff68d-5257-a9dd-d073-00000000001e] 12613 1727096140.76694: sending task result for task 0afff68d-5257-a9dd-d073-00000000001e 12613 1727096140.76783: done sending task result for task 0afff68d-5257-a9dd-d073-00000000001e 12613 1727096140.76785: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096140.76826: no 
more pending results, returning what we have 12613 1727096140.76829: results queue empty 12613 1727096140.76830: checking for any_errors_fatal 12613 1727096140.76835: done checking for any_errors_fatal 12613 1727096140.76836: checking for max_fail_percentage 12613 1727096140.76838: done checking for max_fail_percentage 12613 1727096140.76838: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.76839: done checking to see if all hosts have failed 12613 1727096140.76840: getting the remaining hosts for this loop 12613 1727096140.76841: done getting the remaining hosts for this loop 12613 1727096140.76845: getting the next task for host managed_node1 12613 1727096140.76854: done getting next task for host managed_node1 12613 1727096140.76859: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096140.76862: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.76878: getting variables 12613 1727096140.76879: in VariableManager get_vars() 12613 1727096140.76932: Calling all_inventory to load vars for managed_node1 12613 1727096140.76935: Calling groups_inventory to load vars for managed_node1 12613 1727096140.76937: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.76946: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.76949: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.76954: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.77146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.77276: done with get_vars() 12613 1727096140.77286: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:40 -0400 (0:00:00.036) 0:00:04.410 ****** 12613 1727096140.77357: entering _queue_task() for managed_node1/include_tasks 12613 1727096140.77598: worker is 1 (out of 1 available) 12613 1727096140.77611: exiting _queue_task() for managed_node1/include_tasks 12613 1727096140.77623: done queuing things up, now waiting for results queue to drain 12613 1727096140.77624: waiting for pending results... 
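The "TEST Add Bond with 2 ports" step is a debug task (the debug action plugin is loaded for it), serving as a banner for the test case. A sketch of its likely shape, with the message text assumed since only the task name appears in the log:

    - name: TEST Add Bond with 2 ports
      ansible.builtin.debug:
        msg: TEST Add Bond with 2 ports    # assumed message; only the task name is visible above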
12613 1727096140.77795: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096140.77885: in run() - task 0afff68d-5257-a9dd-d073-000000000026 12613 1727096140.77896: variable 'ansible_search_path' from source: unknown 12613 1727096140.77899: variable 'ansible_search_path' from source: unknown 12613 1727096140.77928: calling self._execute() 12613 1727096140.77995: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.77999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.78008: variable 'omit' from source: magic vars 12613 1727096140.78325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.79893: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.80152: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.80186: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.80210: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.80231: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.80298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.80319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.80336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.80366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.80388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.80486: variable 'ansible_distribution' from source: facts 12613 1727096140.80490: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.80505: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.80508: when evaluation is False, skipping this task 12613 1727096140.80510: _execute() done 12613 1727096140.80512: dumping result to json 12613 1727096140.80515: done dumping result, returning 12613 1727096140.80523: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-a9dd-d073-000000000026] 12613 1727096140.80527: sending task result for task 0afff68d-5257-a9dd-d073-000000000026 12613 1727096140.80615: done sending task result for task 0afff68d-5257-a9dd-d073-000000000026 12613 1727096140.80618: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.80686: no more pending results, returning what we have 12613 1727096140.80690: results queue empty 12613 1727096140.80690: checking for any_errors_fatal 12613 1727096140.80695: done checking for any_errors_fatal 12613 1727096140.80696: checking for max_fail_percentage 12613 1727096140.80697: done checking for max_fail_percentage 12613 1727096140.80698: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.80699: done checking to see if all hosts have failed 12613 1727096140.80700: getting the remaining hosts for this loop 12613 1727096140.80701: done getting the remaining hosts for this loop 12613 1727096140.80705: getting the next task for host managed_node1 12613 1727096140.80712: done getting next task for host managed_node1 12613 1727096140.80717: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096140.80719: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.80738: getting variables 12613 1727096140.80739: in VariableManager get_vars() 12613 1727096140.80797: Calling all_inventory to load vars for managed_node1 12613 1727096140.80799: Calling groups_inventory to load vars for managed_node1 12613 1727096140.80802: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.80809: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.80812: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.80815: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.80950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.81082: done with get_vars() 12613 1727096140.81094: done getting variables 12613 1727096140.81154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:40 -0400 (0:00:00.038) 0:00:04.449 ****** 12613 1727096140.81190: entering _queue_task() for managed_node1/debug 12613 1727096140.81505: worker is 1 (out of 1 available) 12613 1727096140.81519: exiting _queue_task() for managed_node1/debug 12613 1727096140.81530: done queuing things up, now waiting for results queue to drain 12613 1727096140.81532: waiting for pending results... 
12613 1727096140.81990: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096140.81996: in run() - task 0afff68d-5257-a9dd-d073-000000000027 12613 1727096140.81999: variable 'ansible_search_path' from source: unknown 12613 1727096140.82002: variable 'ansible_search_path' from source: unknown 12613 1727096140.82119: calling self._execute() 12613 1727096140.82144: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.82161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.82183: variable 'omit' from source: magic vars 12613 1727096140.82687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.85448: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.85535: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.85586: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.85673: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.85677: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.85769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.85807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.85846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.85898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.85974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.86087: variable 'ansible_distribution' from source: facts 12613 1727096140.86101: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.86125: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.86135: when evaluation is False, skipping this task 12613 1727096140.86148: _execute() done 12613 1727096140.86160: dumping result to json 12613 1727096140.86172: done dumping result, returning 12613 1727096140.86187: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-a9dd-d073-000000000027] 12613 1727096140.86260: sending task result for task 0afff68d-5257-a9dd-d073-000000000027 12613 1727096140.86330: done sending task result for task 0afff68d-5257-a9dd-d073-000000000027 12613 1727096140.86333: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096140.86412: no more pending results, returning what we have 12613 1727096140.86416: results queue empty 12613 1727096140.86417: checking for any_errors_fatal 12613 1727096140.86422: done checking for any_errors_fatal 12613 1727096140.86423: checking for max_fail_percentage 12613 1727096140.86424: done checking for max_fail_percentage 12613 1727096140.86425: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.86426: done checking to see if all hosts have failed 12613 1727096140.86427: getting the remaining hosts for this loop 12613 1727096140.86428: done getting the remaining hosts for this loop 12613 1727096140.86432: getting the next task for host managed_node1 12613 1727096140.86438: done getting next task for host managed_node1 12613 1727096140.86442: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096140.86445: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.86462: getting variables 12613 1727096140.86464: in VariableManager get_vars() 12613 1727096140.86524: Calling all_inventory to load vars for managed_node1 12613 1727096140.86528: Calling groups_inventory to load vars for managed_node1 12613 1727096140.86530: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.86540: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.86543: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.86546: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.86760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.86912: done with get_vars() 12613 1727096140.86920: done getting variables 12613 1727096140.86992: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:40 -0400 (0:00:00.058) 0:00:04.507 ****** 12613 1727096140.87015: entering _queue_task() for managed_node1/fail 12613 1727096140.87017: Creating lock for fail 12613 1727096140.87255: worker is 1 (out of 1 available) 12613 1727096140.87271: exiting _queue_task() for managed_node1/fail 12613 1727096140.87282: done queuing things up, now waiting for results queue to drain 12613 1727096140.87284: waiting for 
pending results... 12613 1727096140.87446: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096140.87535: in run() - task 0afff68d-5257-a9dd-d073-000000000028 12613 1727096140.87545: variable 'ansible_search_path' from source: unknown 12613 1727096140.87549: variable 'ansible_search_path' from source: unknown 12613 1727096140.87581: calling self._execute() 12613 1727096140.87640: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.87643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.87656: variable 'omit' from source: magic vars 12613 1727096140.87963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.89730: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.89776: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.89805: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.89844: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.89864: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.89927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.89953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.89973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.89999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.90010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.90117: variable 'ansible_distribution' from source: facts 12613 1727096140.90120: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.90146: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.90149: when evaluation is False, skipping this task 12613 1727096140.90151: _execute() done 12613 1727096140.90154: dumping result to json 12613 1727096140.90156: done dumping result, returning 12613 1727096140.90159: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-a9dd-d073-000000000028] 12613 1727096140.90161: sending task result for task 
0afff68d-5257-a9dd-d073-000000000028 12613 1727096140.90270: done sending task result for task 0afff68d-5257-a9dd-d073-000000000028 12613 1727096140.90273: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.90317: no more pending results, returning what we have 12613 1727096140.90320: results queue empty 12613 1727096140.90321: checking for any_errors_fatal 12613 1727096140.90326: done checking for any_errors_fatal 12613 1727096140.90326: checking for max_fail_percentage 12613 1727096140.90328: done checking for max_fail_percentage 12613 1727096140.90329: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.90330: done checking to see if all hosts have failed 12613 1727096140.90331: getting the remaining hosts for this loop 12613 1727096140.90332: done getting the remaining hosts for this loop 12613 1727096140.90335: getting the next task for host managed_node1 12613 1727096140.90341: done getting next task for host managed_node1 12613 1727096140.90344: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096140.90347: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096140.90360: getting variables 12613 1727096140.90362: in VariableManager get_vars() 12613 1727096140.90417: Calling all_inventory to load vars for managed_node1 12613 1727096140.90420: Calling groups_inventory to load vars for managed_node1 12613 1727096140.90422: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.90433: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.90436: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.90439: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.90634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.90847: done with get_vars() 12613 1727096140.90859: done getting variables 12613 1727096140.90936: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:40 -0400 (0:00:00.039) 0:00:04.546 ****** 12613 1727096140.90975: entering _queue_task() for managed_node1/fail 12613 1727096140.91290: worker is 1 (out of 1 available) 12613 1727096140.91303: exiting _queue_task() for managed_node1/fail 12613 1727096140.91315: done queuing things up, now waiting for results queue to drain 12613 1727096140.91317: waiting for pending results... 
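The role's early steps in roles/network/tasks/main.yml are safety checks implemented as fail tasks behind when: conditions. As a rough sketch of the general shape of the network_state/initscripts guard, using hypothetical variable names network_state and network_provider (the role's actual condition and message are not visible in this log):

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider    # assumed message text
      when:
        - network_state is defined    # hypothetical condition
        - network_provider == "initscripts"    # hypothetical condition

As with the other tasks here, the distribution guard shown under "false_condition" evaluates to False first, so the fail task is skipped without its own check ever mattering.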
12613 1727096140.91594: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096140.91687: in run() - task 0afff68d-5257-a9dd-d073-000000000029 12613 1727096140.91745: variable 'ansible_search_path' from source: unknown 12613 1727096140.91748: variable 'ansible_search_path' from source: unknown 12613 1727096140.91754: calling self._execute() 12613 1727096140.91824: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.91831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.91838: variable 'omit' from source: magic vars 12613 1727096140.92160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.94075: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.94080: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.94098: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.94138: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.94175: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.94256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.94291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.94316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.94352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.94371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.94518: variable 'ansible_distribution' from source: facts 12613 1727096140.94528: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.94548: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.94554: when evaluation is False, skipping this task 12613 1727096140.94560: _execute() done 12613 1727096140.94565: dumping result to json 12613 1727096140.94574: done dumping result, returning 12613 1727096140.94585: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-a9dd-d073-000000000029] 12613 1727096140.94594: sending task result for task 0afff68d-5257-a9dd-d073-000000000029 skipping: [managed_node1] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.94758: no more pending results, returning what we have 12613 1727096140.94762: results queue empty 12613 1727096140.94763: checking for any_errors_fatal 12613 1727096140.94772: done checking for any_errors_fatal 12613 1727096140.94772: checking for max_fail_percentage 12613 1727096140.94774: done checking for max_fail_percentage 12613 1727096140.94775: checking to see if all hosts have failed and the running result is not ok 12613 1727096140.94776: done checking to see if all hosts have failed 12613 1727096140.94777: getting the remaining hosts for this loop 12613 1727096140.94778: done getting the remaining hosts for this loop 12613 1727096140.94781: getting the next task for host managed_node1 12613 1727096140.94787: done getting next task for host managed_node1 12613 1727096140.94791: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096140.94793: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096140.94807: getting variables 12613 1727096140.94809: in VariableManager get_vars() 12613 1727096140.94975: Calling all_inventory to load vars for managed_node1 12613 1727096140.94978: Calling groups_inventory to load vars for managed_node1 12613 1727096140.94980: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096140.95009: Calling all_plugins_play to load vars for managed_node1 12613 1727096140.95012: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096140.95015: Calling groups_plugins_play to load vars for managed_node1 12613 1727096140.95230: done sending task result for task 0afff68d-5257-a9dd-d073-000000000029 12613 1727096140.95233: WORKER PROCESS EXITING 12613 1727096140.95257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096140.95485: done with get_vars() 12613 1727096140.95496: done getting variables 12613 1727096140.95555: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:40 -0400 (0:00:00.046) 0:00:04.593 ****** 12613 1727096140.95595: entering _queue_task() for managed_node1/fail 12613 1727096140.95910: worker is 1 (out of 1 available) 12613 1727096140.95924: exiting _queue_task() for managed_node1/fail 
12613 1727096140.95935: done queuing things up, now waiting for results queue to drain 12613 1727096140.95937: waiting for pending results... 12613 1727096140.96285: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096140.96289: in run() - task 0afff68d-5257-a9dd-d073-00000000002a 12613 1727096140.96292: variable 'ansible_search_path' from source: unknown 12613 1727096140.96294: variable 'ansible_search_path' from source: unknown 12613 1727096140.96308: calling self._execute() 12613 1727096140.96386: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096140.96397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096140.96411: variable 'omit' from source: magic vars 12613 1727096140.96821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096140.99110: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096140.99187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096140.99246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096140.99290: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096140.99325: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096140.99418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096140.99456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096140.99503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096140.99554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096140.99577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096140.99716: variable 'ansible_distribution' from source: facts 12613 1727096140.99728: variable 'ansible_distribution_major_version' from source: facts 12613 1727096140.99755: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096140.99846: when evaluation is False, skipping this task 12613 1727096140.99849: _execute() done 12613 1727096140.99851: dumping result to json 12613 1727096140.99853: done dumping result, returning 12613 1727096140.99855: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0afff68d-5257-a9dd-d073-00000000002a] 12613 1727096140.99858: sending task result for task 0afff68d-5257-a9dd-d073-00000000002a 12613 1727096140.99930: done sending task result for task 0afff68d-5257-a9dd-d073-00000000002a 12613 1727096140.99934: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096140.99999: no more pending results, returning what we have 12613 1727096141.00002: results queue empty 12613 1727096141.00003: checking for any_errors_fatal 12613 1727096141.00009: done checking for any_errors_fatal 12613 1727096141.00010: checking for max_fail_percentage 12613 1727096141.00012: done checking for max_fail_percentage 12613 1727096141.00013: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.00014: done checking to see if all hosts have failed 12613 1727096141.00015: getting the remaining hosts for this loop 12613 1727096141.00016: done getting the remaining hosts for this loop 12613 1727096141.00020: getting the next task for host managed_node1 12613 1727096141.00027: done getting next task for host managed_node1 12613 1727096141.00031: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096141.00033: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096141.00047: getting variables 12613 1727096141.00049: in VariableManager get_vars() 12613 1727096141.00109: Calling all_inventory to load vars for managed_node1 12613 1727096141.00112: Calling groups_inventory to load vars for managed_node1 12613 1727096141.00115: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.00127: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.00130: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.00133: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.00502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.00770: done with get_vars() 12613 1727096141.00782: done getting variables 12613 1727096141.00876: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:41 -0400 (0:00:00.053) 0:00:04.646 ****** 12613 1727096141.00909: entering _queue_task() for managed_node1/dnf 12613 1727096141.01190: worker is 1 (out of 1 available) 12613 1727096141.01201: exiting _queue_task() for managed_node1/dnf 12613 1727096141.01213: done queuing things up, now waiting for results queue to drain 12613 1727096141.01214: waiting for pending results... 
12613 1727096141.01478: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096141.01613: in run() - task 0afff68d-5257-a9dd-d073-00000000002b 12613 1727096141.01636: variable 'ansible_search_path' from source: unknown 12613 1727096141.01645: variable 'ansible_search_path' from source: unknown 12613 1727096141.01693: calling self._execute() 12613 1727096141.01781: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.01796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.01811: variable 'omit' from source: magic vars 12613 1727096141.02663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.04698: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.04781: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.04828: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.04883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.04915: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.05010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.05051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.05084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.05129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.05154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.05299: variable 'ansible_distribution' from source: facts 12613 1727096141.05310: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.05334: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.05370: when evaluation is False, skipping this task 12613 1727096141.05374: _execute() done 12613 1727096141.05376: dumping result to json 12613 1727096141.05379: done dumping result, returning 12613 1727096141.05382: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-00000000002b] 12613 1727096141.05384: sending task result for task 
0afff68d-5257-a9dd-d073-00000000002b skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.05542: no more pending results, returning what we have 12613 1727096141.05547: results queue empty 12613 1727096141.05548: checking for any_errors_fatal 12613 1727096141.05553: done checking for any_errors_fatal 12613 1727096141.05553: checking for max_fail_percentage 12613 1727096141.05555: done checking for max_fail_percentage 12613 1727096141.05556: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.05557: done checking to see if all hosts have failed 12613 1727096141.05557: getting the remaining hosts for this loop 12613 1727096141.05559: done getting the remaining hosts for this loop 12613 1727096141.05563: getting the next task for host managed_node1 12613 1727096141.05571: done getting next task for host managed_node1 12613 1727096141.05575: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096141.05578: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096141.05593: getting variables 12613 1727096141.05595: in VariableManager get_vars() 12613 1727096141.05650: Calling all_inventory to load vars for managed_node1 12613 1727096141.05653: Calling groups_inventory to load vars for managed_node1 12613 1727096141.05655: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.05666: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.05973: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.05978: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.06401: done sending task result for task 0afff68d-5257-a9dd-d073-00000000002b 12613 1727096141.06404: WORKER PROCESS EXITING 12613 1727096141.06426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.06619: done with get_vars() 12613 1727096141.06631: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096141.06704: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:41 -0400 (0:00:00.058) 0:00:04.704 ****** 12613 1727096141.06735: entering _queue_task() for managed_node1/yum 12613 1727096141.06736: Creating lock for yum 12613 1727096141.07137: worker is 1 (out of 1 available) 12613 1727096141.07148: exiting _queue_task() for managed_node1/yum 12613 1727096141.07163: done queuing things up, now waiting for results queue to drain 12613 1727096141.07165: waiting for pending results... 
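[editor's note] The entries above show ansible-core redirecting the ansible.builtin.yum action to ansible.builtin.dnf before queuing the task defined at roles/network/tasks/main.yml:48, and the task is then skipped because the distribution guard evaluates to False. A minimal sketch of what such a guarded task could look like follows; the module argument (`list: updates`) is an assumption, and only the `when` expression is taken verbatim from the reported false_condition.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:     # ansible-core 2.17 redirects this action to ansible.builtin.dnf, as the log above shows
    list: updates          # assumed argument; the real module arguments are not printed in this log
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9

Because the condition is False on this host, the TaskExecutor reports skipping with skip_reason "Conditional result was False" and no module code is ever shipped to the node.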
12613 1727096141.07429: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096141.07576: in run() - task 0afff68d-5257-a9dd-d073-00000000002c 12613 1727096141.07592: variable 'ansible_search_path' from source: unknown 12613 1727096141.07595: variable 'ansible_search_path' from source: unknown 12613 1727096141.07624: calling self._execute() 12613 1727096141.07693: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.07698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.07707: variable 'omit' from source: magic vars 12613 1727096141.08016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.09775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.09780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.09784: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.09824: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.09855: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.09943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.09982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.10013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.10056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.10077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.10210: variable 'ansible_distribution' from source: facts 12613 1727096141.10222: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.10246: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.10255: when evaluation is False, skipping this task 12613 1727096141.10266: _execute() done 12613 1727096141.10289: dumping result to json 12613 1727096141.10371: done dumping result, returning 12613 1727096141.10375: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-00000000002c] 12613 1727096141.10378: sending task result for task 
0afff68d-5257-a9dd-d073-00000000002c 12613 1727096141.10465: done sending task result for task 0afff68d-5257-a9dd-d073-00000000002c skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.10518: no more pending results, returning what we have 12613 1727096141.10521: results queue empty 12613 1727096141.10521: checking for any_errors_fatal 12613 1727096141.10528: done checking for any_errors_fatal 12613 1727096141.10529: checking for max_fail_percentage 12613 1727096141.10530: done checking for max_fail_percentage 12613 1727096141.10531: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.10532: done checking to see if all hosts have failed 12613 1727096141.10532: getting the remaining hosts for this loop 12613 1727096141.10534: done getting the remaining hosts for this loop 12613 1727096141.10537: getting the next task for host managed_node1 12613 1727096141.10543: done getting next task for host managed_node1 12613 1727096141.10547: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096141.10549: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096141.10563: getting variables 12613 1727096141.10564: in VariableManager get_vars() 12613 1727096141.10620: Calling all_inventory to load vars for managed_node1 12613 1727096141.10623: Calling groups_inventory to load vars for managed_node1 12613 1727096141.10625: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.10635: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.10638: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.10640: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.10790: WORKER PROCESS EXITING 12613 1727096141.10811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.10938: done with get_vars() 12613 1727096141.10946: done getting variables 12613 1727096141.10989: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:41 -0400 (0:00:00.042) 0:00:04.747 ****** 12613 1727096141.11014: entering _queue_task() for managed_node1/fail 12613 1727096141.11229: worker is 1 (out of 1 available) 12613 1727096141.11242: exiting _queue_task() for managed_node1/fail 12613 1727096141.11253: done queuing things up, now waiting for results queue to drain 12613 1727096141.11255: waiting for pending results... 
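[editor's note] Next the `fail` action plugin is loaded for "Ask user's consent to restart NetworkManager due to wireless or team interfaces" (main.yml:60). A hedged reconstruction of such a consent gate is below; the message wording is an assumption, and only the logged `when` guard is verbatim.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-                # assumed wording; the actual message is not visible in this log
      Restarting NetworkManager is required for wireless or team interfaces.
      Re-run with the appropriate consent variable set to proceed.
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9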
12613 1727096141.11426: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096141.11524: in run() - task 0afff68d-5257-a9dd-d073-00000000002d 12613 1727096141.11535: variable 'ansible_search_path' from source: unknown 12613 1727096141.11539: variable 'ansible_search_path' from source: unknown 12613 1727096141.11570: calling self._execute() 12613 1727096141.11632: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.11636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.11646: variable 'omit' from source: magic vars 12613 1727096141.12013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.14145: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.14211: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.14239: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.14266: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.14288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.14354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.14376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.14394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.14419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.14432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.14537: variable 'ansible_distribution' from source: facts 12613 1727096141.14543: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.14650: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.14657: when evaluation is False, skipping this task 12613 1727096141.14658: _execute() done 12613 1727096141.14660: dumping result to json 12613 1727096141.14662: done dumping result, returning 12613 1727096141.14664: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-00000000002d] 12613 1727096141.14665: sending task result for task 0afff68d-5257-a9dd-d073-00000000002d 12613 1727096141.14734: done sending task result for task 
0afff68d-5257-a9dd-d073-00000000002d 12613 1727096141.14737: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.14793: no more pending results, returning what we have 12613 1727096141.14797: results queue empty 12613 1727096141.14798: checking for any_errors_fatal 12613 1727096141.14802: done checking for any_errors_fatal 12613 1727096141.14803: checking for max_fail_percentage 12613 1727096141.14804: done checking for max_fail_percentage 12613 1727096141.14805: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.14805: done checking to see if all hosts have failed 12613 1727096141.14806: getting the remaining hosts for this loop 12613 1727096141.14807: done getting the remaining hosts for this loop 12613 1727096141.14810: getting the next task for host managed_node1 12613 1727096141.14816: done getting next task for host managed_node1 12613 1727096141.14819: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12613 1727096141.14822: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.14834: getting variables 12613 1727096141.14835: in VariableManager get_vars() 12613 1727096141.14889: Calling all_inventory to load vars for managed_node1 12613 1727096141.14891: Calling groups_inventory to load vars for managed_node1 12613 1727096141.14892: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.14899: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.14901: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.14902: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.15055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.15180: done with get_vars() 12613 1727096141.15190: done getting variables 12613 1727096141.15232: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:41 -0400 (0:00:00.042) 0:00:04.789 ****** 12613 1727096141.15257: entering _queue_task() for managed_node1/package 12613 1727096141.15485: worker is 1 (out of 1 available) 12613 1727096141.15498: exiting _queue_task() for managed_node1/package 12613 1727096141.15510: done queuing things up, now waiting for results queue to drain 12613 1727096141.15511: waiting for pending results... 
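[editor's note] The "Install packages" task (main.yml:73) is dispatched through the generic `package` action plugin, which selects the backend package manager for the target. A sketch under the assumption that the package list comes from a role variable; the variable name below is hypothetical.

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # hypothetical variable; the real arguments are not logged
    state: present
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9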
12613 1727096141.15732: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 12613 1727096141.15829: in run() - task 0afff68d-5257-a9dd-d073-00000000002e 12613 1727096141.15837: variable 'ansible_search_path' from source: unknown 12613 1727096141.15972: variable 'ansible_search_path' from source: unknown 12613 1727096141.15976: calling self._execute() 12613 1727096141.15978: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.15980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.15996: variable 'omit' from source: magic vars 12613 1727096141.16428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.18327: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.18380: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.18406: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.18432: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.18458: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.18517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.18538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.18557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.18588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.18600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.18700: variable 'ansible_distribution' from source: facts 12613 1727096141.18705: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.18720: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.18723: when evaluation is False, skipping this task 12613 1727096141.18726: _execute() done 12613 1727096141.18728: dumping result to json 12613 1727096141.18730: done dumping result, returning 12613 1727096141.18738: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-a9dd-d073-00000000002e] 12613 1727096141.18742: sending task result for task 0afff68d-5257-a9dd-d073-00000000002e 12613 1727096141.18835: done sending task result for task 0afff68d-5257-a9dd-d073-00000000002e 12613 1727096141.18837: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.18890: no more pending results, returning what we have 12613 1727096141.18893: results queue empty 12613 1727096141.18894: checking for any_errors_fatal 12613 1727096141.18903: done checking for any_errors_fatal 12613 1727096141.18904: checking for max_fail_percentage 12613 1727096141.18906: done checking for max_fail_percentage 12613 1727096141.18907: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.18907: done checking to see if all hosts have failed 12613 1727096141.18908: getting the remaining hosts for this loop 12613 1727096141.18909: done getting the remaining hosts for this loop 12613 1727096141.18912: getting the next task for host managed_node1 12613 1727096141.18918: done getting next task for host managed_node1 12613 1727096141.18922: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096141.18924: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.18937: getting variables 12613 1727096141.18938: in VariableManager get_vars() 12613 1727096141.18999: Calling all_inventory to load vars for managed_node1 12613 1727096141.19002: Calling groups_inventory to load vars for managed_node1 12613 1727096141.19004: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.19012: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.19014: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.19016: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.19148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.19306: done with get_vars() 12613 1727096141.19314: done getting variables 12613 1727096141.19354: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:41 -0400 (0:00:00.041) 0:00:04.830 ****** 12613 1727096141.19380: entering _queue_task() for managed_node1/package 12613 1727096141.19590: worker is 1 (out of 1 available) 12613 1727096141.19603: exiting _queue_task() for managed_node1/package 12613 1727096141.19615: done queuing things up, now waiting for results queue to drain 12613 1727096141.19616: waiting for pending results... 
12613 1727096141.20091: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096141.20100: in run() - task 0afff68d-5257-a9dd-d073-00000000002f 12613 1727096141.20105: variable 'ansible_search_path' from source: unknown 12613 1727096141.20109: variable 'ansible_search_path' from source: unknown 12613 1727096141.20146: calling self._execute() 12613 1727096141.20243: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.20257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.20277: variable 'omit' from source: magic vars 12613 1727096141.20757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.22288: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.22342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.22373: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.22401: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.22421: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.22483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.22504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.22523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.22548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.22561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.22659: variable 'ansible_distribution' from source: facts 12613 1727096141.22664: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.22682: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.22688: when evaluation is False, skipping this task 12613 1727096141.22690: _execute() done 12613 1727096141.22692: dumping result to json 12613 1727096141.22695: done dumping result, returning 12613 1727096141.22697: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-a9dd-d073-00000000002f] 12613 1727096141.22705: sending task result for task 0afff68d-5257-a9dd-d073-00000000002f skipping: [managed_node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.22843: no more pending results, returning what we have 12613 1727096141.22847: results queue empty 12613 1727096141.22848: checking for any_errors_fatal 12613 1727096141.22856: done checking for any_errors_fatal 12613 1727096141.22856: checking for max_fail_percentage 12613 1727096141.22858: done checking for max_fail_percentage 12613 1727096141.22859: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.22860: done checking to see if all hosts have failed 12613 1727096141.22860: getting the remaining hosts for this loop 12613 1727096141.22862: done getting the remaining hosts for this loop 12613 1727096141.22865: getting the next task for host managed_node1 12613 1727096141.22873: done getting next task for host managed_node1 12613 1727096141.22877: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096141.22880: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.22893: getting variables 12613 1727096141.22895: in VariableManager get_vars() 12613 1727096141.22947: Calling all_inventory to load vars for managed_node1 12613 1727096141.22949: Calling groups_inventory to load vars for managed_node1 12613 1727096141.22954: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.22962: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.22964: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.22976: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.22989: done sending task result for task 0afff68d-5257-a9dd-d073-00000000002f 12613 1727096141.22991: WORKER PROCESS EXITING 12613 1727096141.23120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.23248: done with get_vars() 12613 1727096141.23257: done getting variables 12613 1727096141.23301: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:41 -0400 (0:00:00.039) 0:00:04.870 ****** 12613 1727096141.23325: entering _queue_task() for managed_node1/package 12613 1727096141.23546: worker is 1 (out of 1 available) 12613 1727096141.23559: exiting _queue_task() for managed_node1/package 12613 1727096141.23573: done queuing things up, now waiting for results queue to drain 12613 
1727096141.23575: waiting for pending results... 12613 1727096141.23754: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096141.23841: in run() - task 0afff68d-5257-a9dd-d073-000000000030 12613 1727096141.23855: variable 'ansible_search_path' from source: unknown 12613 1727096141.23858: variable 'ansible_search_path' from source: unknown 12613 1727096141.23887: calling self._execute() 12613 1727096141.23948: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.23955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.23963: variable 'omit' from source: magic vars 12613 1727096141.24269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.25880: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.25927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.25957: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.25986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.26007: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.26068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.26094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.26112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.26138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.26149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.26250: variable 'ansible_distribution' from source: facts 12613 1727096141.26257: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.26271: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.26274: when evaluation is False, skipping this task 12613 1727096141.26276: _execute() done 12613 1727096141.26279: dumping result to json 12613 1727096141.26281: done dumping result, returning 12613 1727096141.26289: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-a9dd-d073-000000000030] 12613 1727096141.26295: sending task result for task 0afff68d-5257-a9dd-d073-000000000030 12613 1727096141.26391: done sending task result for 
task 0afff68d-5257-a9dd-d073-000000000030 12613 1727096141.26394: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.26477: no more pending results, returning what we have 12613 1727096141.26481: results queue empty 12613 1727096141.26483: checking for any_errors_fatal 12613 1727096141.26489: done checking for any_errors_fatal 12613 1727096141.26489: checking for max_fail_percentage 12613 1727096141.26491: done checking for max_fail_percentage 12613 1727096141.26492: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.26493: done checking to see if all hosts have failed 12613 1727096141.26493: getting the remaining hosts for this loop 12613 1727096141.26494: done getting the remaining hosts for this loop 12613 1727096141.26498: getting the next task for host managed_node1 12613 1727096141.26504: done getting next task for host managed_node1 12613 1727096141.26508: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096141.26510: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096141.26525: getting variables 12613 1727096141.26527: in VariableManager get_vars() 12613 1727096141.26579: Calling all_inventory to load vars for managed_node1 12613 1727096141.26582: Calling groups_inventory to load vars for managed_node1 12613 1727096141.26584: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.26592: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.26594: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.26596: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.26764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.26898: done with get_vars() 12613 1727096141.26906: done getting variables 12613 1727096141.26977: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:41 -0400 (0:00:00.036) 0:00:04.907 ****** 12613 1727096141.27000: entering _queue_task() for managed_node1/service 12613 1727096141.27002: Creating lock for service 12613 1727096141.27228: worker is 1 (out of 1 available) 12613 1727096141.27241: exiting _queue_task() for managed_node1/service 12613 1727096141.27253: done queuing things up, now waiting for results queue to drain 12613 1727096141.27254: waiting for pending results... 
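[editor's note] At main.yml:109 the `service` action plugin is loaded for the first time in this run (found_in_cache=False; later service tasks hit the cache) and "Restart NetworkManager due to wireless or team interfaces" is queued. A minimal sketch, assuming the service name matches the task name:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed from the task name
    state: restarted
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9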
12613 1727096141.27574: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096141.27670: in run() - task 0afff68d-5257-a9dd-d073-000000000031 12613 1727096141.27692: variable 'ansible_search_path' from source: unknown 12613 1727096141.27700: variable 'ansible_search_path' from source: unknown 12613 1727096141.27738: calling self._execute() 12613 1727096141.27834: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.27845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.27991: variable 'omit' from source: magic vars 12613 1727096141.28470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.30778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.30832: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.30864: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.30891: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.30913: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.30978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.30999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.31015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.31042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.31056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.31154: variable 'ansible_distribution' from source: facts 12613 1727096141.31158: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.31181: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.31184: when evaluation is False, skipping this task 12613 1727096141.31186: _execute() done 12613 1727096141.31188: dumping result to json 12613 1727096141.31191: done dumping result, returning 12613 1727096141.31193: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000031] 12613 1727096141.31195: sending task result for task 0afff68d-5257-a9dd-d073-000000000031 12613 1727096141.31290: done sending task result for task 0afff68d-5257-a9dd-d073-000000000031 12613 
1727096141.31293: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.31391: no more pending results, returning what we have 12613 1727096141.31395: results queue empty 12613 1727096141.31396: checking for any_errors_fatal 12613 1727096141.31403: done checking for any_errors_fatal 12613 1727096141.31404: checking for max_fail_percentage 12613 1727096141.31406: done checking for max_fail_percentage 12613 1727096141.31407: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.31407: done checking to see if all hosts have failed 12613 1727096141.31408: getting the remaining hosts for this loop 12613 1727096141.31409: done getting the remaining hosts for this loop 12613 1727096141.31413: getting the next task for host managed_node1 12613 1727096141.31419: done getting next task for host managed_node1 12613 1727096141.31423: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096141.31426: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.31439: getting variables 12613 1727096141.31441: in VariableManager get_vars() 12613 1727096141.31498: Calling all_inventory to load vars for managed_node1 12613 1727096141.31500: Calling groups_inventory to load vars for managed_node1 12613 1727096141.31502: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.31512: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.31514: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.31517: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.32144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.32946: done with get_vars() 12613 1727096141.32963: done getting variables 12613 1727096141.33037: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:41 -0400 (0:00:00.060) 0:00:04.969 ****** 12613 1727096141.33218: entering _queue_task() for managed_node1/service 12613 1727096141.34375: worker is 1 (out of 1 available) 12613 1727096141.34394: exiting _queue_task() for managed_node1/service 12613 1727096141.34406: done queuing things up, now waiting for results queue to drain 12613 1727096141.34407: waiting for pending results... 
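[editor's note] "Enable and start NetworkManager" (main.yml:122) reuses the cached service action plugin. Its result a little further down is printed as censored, which is what `no_log: true` produces, so the sketch includes that keyword; the service arguments themselves are assumptions.

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager   # assumed
    state: started
    enabled: true
  no_log: true             # matches the "censored ... 'no_log: true'" result reported below
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9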
12613 1727096141.34619: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096141.34931: in run() - task 0afff68d-5257-a9dd-d073-000000000032 12613 1727096141.34935: variable 'ansible_search_path' from source: unknown 12613 1727096141.34938: variable 'ansible_search_path' from source: unknown 12613 1727096141.34961: calling self._execute() 12613 1727096141.35073: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.35077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.35080: variable 'omit' from source: magic vars 12613 1727096141.36105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.39184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.39374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.39378: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.39380: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.39382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.39459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.39507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.39539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.39592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.39620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.39802: variable 'ansible_distribution' from source: facts 12613 1727096141.39816: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.39862: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.39872: when evaluation is False, skipping this task 12613 1727096141.40270: _execute() done 12613 1727096141.40276: dumping result to json 12613 1727096141.40278: done dumping result, returning 12613 1727096141.40281: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-a9dd-d073-000000000032] 12613 1727096141.40283: sending task result for task 0afff68d-5257-a9dd-d073-000000000032 12613 1727096141.40355: done sending task result for task 0afff68d-5257-a9dd-d073-000000000032 12613 1727096141.40359: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 1727096141.40614: no more pending results, returning what we have 12613 1727096141.40618: results queue empty 12613 1727096141.40619: checking for any_errors_fatal 12613 1727096141.40626: done checking for any_errors_fatal 12613 1727096141.40627: checking for max_fail_percentage 12613 1727096141.40630: done checking for max_fail_percentage 12613 1727096141.40631: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.40631: done checking to see if all hosts have failed 12613 1727096141.40632: getting the remaining hosts for this loop 12613 1727096141.40634: done getting the remaining hosts for this loop 12613 1727096141.40638: getting the next task for host managed_node1 12613 1727096141.40646: done getting next task for host managed_node1 12613 1727096141.40650: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096141.40653: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.40670: getting variables 12613 1727096141.40672: in VariableManager get_vars() 12613 1727096141.40731: Calling all_inventory to load vars for managed_node1 12613 1727096141.40734: Calling groups_inventory to load vars for managed_node1 12613 1727096141.40737: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.40749: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.40751: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.40755: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.41514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.41925: done with get_vars() 12613 1727096141.41938: done getting variables 12613 1727096141.42053: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:41 -0400 (0:00:00.088) 0:00:05.058 ****** 12613 1727096141.42091: entering _queue_task() for managed_node1/service 12613 1727096141.42447: worker is 1 (out of 1 available) 12613 1727096141.42465: exiting _queue_task() for managed_node1/service 12613 1727096141.42479: done queuing things up, now waiting for results queue to drain 12613 1727096141.42481: waiting for pending results... 
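[editor's note] The censored block at the start of this chunk is the skip result for the previous task with no_log in effect; "Enable and start wpa_supplicant" (main.yml:133) is then queued. A sketch with an assumed service name; only the `when` guard is taken from the logged false_condition.

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant   # assumed from the task name
    state: started
    enabled: true
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9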
12613 1727096141.42643: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096141.42731: in run() - task 0afff68d-5257-a9dd-d073-000000000033 12613 1727096141.42741: variable 'ansible_search_path' from source: unknown 12613 1727096141.42745: variable 'ansible_search_path' from source: unknown 12613 1727096141.42778: calling self._execute() 12613 1727096141.42840: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.42844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.42857: variable 'omit' from source: magic vars 12613 1727096141.43167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.46706: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.46745: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.46791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.46829: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.46863: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.46947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.46986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.47018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.47066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.47172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.47228: variable 'ansible_distribution' from source: facts 12613 1727096141.47239: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.47263: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.47272: when evaluation is False, skipping this task 12613 1727096141.47278: _execute() done 12613 1727096141.47284: dumping result to json 12613 1727096141.47290: done dumping result, returning 12613 1727096141.47301: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-a9dd-d073-000000000033] 12613 1727096141.47310: sending task result for task 0afff68d-5257-a9dd-d073-000000000033 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)", "skip_reason": "Conditional result was False" } 12613 1727096141.47461: no more pending results, returning what we have 12613 1727096141.47464: results queue empty 12613 1727096141.47465: checking for any_errors_fatal 12613 1727096141.47475: done checking for any_errors_fatal 12613 1727096141.47476: checking for max_fail_percentage 12613 1727096141.47477: done checking for max_fail_percentage 12613 1727096141.47478: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.47479: done checking to see if all hosts have failed 12613 1727096141.47479: getting the remaining hosts for this loop 12613 1727096141.47481: done getting the remaining hosts for this loop 12613 1727096141.47484: getting the next task for host managed_node1 12613 1727096141.47491: done getting next task for host managed_node1 12613 1727096141.47494: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096141.47497: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.47510: getting variables 12613 1727096141.47512: in VariableManager get_vars() 12613 1727096141.47675: Calling all_inventory to load vars for managed_node1 12613 1727096141.47678: Calling groups_inventory to load vars for managed_node1 12613 1727096141.47680: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.47688: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.47690: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.47693: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.47879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.48219: done with get_vars() 12613 1727096141.48232: done getting variables 12613 1727096141.48518: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:41 -0400 (0:00:00.064) 0:00:05.122 ****** 12613 1727096141.48557: entering _queue_task() for managed_node1/service 12613 1727096141.48578: done sending task result for task 0afff68d-5257-a9dd-d073-000000000033 12613 1727096141.48581: WORKER PROCESS EXITING 12613 1727096141.49179: worker is 1 (out of 1 available) 12613 1727096141.49192: exiting _queue_task() for managed_node1/service 12613 1727096141.49203: done queuing things up, now waiting for results queue to drain 12613 1727096141.49204: waiting for pending results... 
12613 1727096141.49873: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096141.50374: in run() - task 0afff68d-5257-a9dd-d073-000000000034 12613 1727096141.50378: variable 'ansible_search_path' from source: unknown 12613 1727096141.50385: variable 'ansible_search_path' from source: unknown 12613 1727096141.50388: calling self._execute() 12613 1727096141.50391: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.50394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.50398: variable 'omit' from source: magic vars 12613 1727096141.51388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.54284: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.54366: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.54612: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.54653: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.54686: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.54815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.54906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.54991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.55105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.55123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.55538: variable 'ansible_distribution' from source: facts 12613 1727096141.55549: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.55609: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.55652: when evaluation is False, skipping this task 12613 1727096141.55659: _execute() done 12613 1727096141.55664: dumping result to json 12613 1727096141.55674: done dumping result, returning 12613 1727096141.55685: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-a9dd-d073-000000000034] 12613 1727096141.55713: sending task result for task 0afff68d-5257-a9dd-d073-000000000034 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 
1727096141.55928: no more pending results, returning what we have 12613 1727096141.55932: results queue empty 12613 1727096141.55933: checking for any_errors_fatal 12613 1727096141.55940: done checking for any_errors_fatal 12613 1727096141.55941: checking for max_fail_percentage 12613 1727096141.55942: done checking for max_fail_percentage 12613 1727096141.55943: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.55944: done checking to see if all hosts have failed 12613 1727096141.55945: getting the remaining hosts for this loop 12613 1727096141.55946: done getting the remaining hosts for this loop 12613 1727096141.55950: getting the next task for host managed_node1 12613 1727096141.55959: done getting next task for host managed_node1 12613 1727096141.56038: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096141.56042: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.56057: getting variables 12613 1727096141.56058: in VariableManager get_vars() 12613 1727096141.56238: Calling all_inventory to load vars for managed_node1 12613 1727096141.56241: Calling groups_inventory to load vars for managed_node1 12613 1727096141.56243: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.56256: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.56259: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.56263: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.56518: done sending task result for task 0afff68d-5257-a9dd-d073-000000000034 12613 1727096141.56522: WORKER PROCESS EXITING 12613 1727096141.56535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.56827: done with get_vars() 12613 1727096141.56846: done getting variables 12613 1727096141.57086: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:41 -0400 (0:00:00.085) 0:00:05.208 ****** 12613 1727096141.57118: entering _queue_task() for managed_node1/copy 12613 1727096141.57846: worker is 1 (out of 1 available) 12613 1727096141.57859: exiting _queue_task() for managed_node1/copy 12613 1727096141.57872: done queuing things up, now waiting for results queue to drain 12613 1727096141.57873: waiting for pending results... 
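The 'Enable network service' task adds one wrinkle: its result is censored because no_log was in effect for it, so even a skipped result is replaced with the standard placeholder text seen above. A minimal sketch of the keyword in use (the service name below is illustrative, not taken from the role's source):

- name: Enable network service
  ansible.builtin.service:
    name: NetworkManager        # illustrative; the real task derives the service name from role variables
    enabled: true
  no_log: true                  # hides module arguments and results, including skip results, in output and logs
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9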
12613 1727096141.58239: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096141.58397: in run() - task 0afff68d-5257-a9dd-d073-000000000035 12613 1727096141.58415: variable 'ansible_search_path' from source: unknown 12613 1727096141.58422: variable 'ansible_search_path' from source: unknown 12613 1727096141.58470: calling self._execute() 12613 1727096141.58556: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.58570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.58589: variable 'omit' from source: magic vars 12613 1727096141.59031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.62385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.62390: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.62415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.62457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.62493: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.62578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.62617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.62648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.62696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.62720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.62861: variable 'ansible_distribution' from source: facts 12613 1727096141.62874: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.62895: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.62902: when evaluation is False, skipping this task 12613 1727096141.62908: _execute() done 12613 1727096141.62914: dumping result to json 12613 1727096141.62925: done dumping result, returning 12613 1727096141.62936: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-a9dd-d073-000000000035] 12613 1727096141.62945: sending task result for task 0afff68d-5257-a9dd-d073-000000000035 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.63241: no more pending results, returning what we have 12613 1727096141.63245: results queue empty 12613 1727096141.63246: checking for any_errors_fatal 12613 1727096141.63254: done checking for any_errors_fatal 12613 1727096141.63255: checking for max_fail_percentage 12613 1727096141.63257: done checking for max_fail_percentage 12613 1727096141.63258: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.63258: done checking to see if all hosts have failed 12613 1727096141.63259: getting the remaining hosts for this loop 12613 1727096141.63261: done getting the remaining hosts for this loop 12613 1727096141.63265: getting the next task for host managed_node1 12613 1727096141.63275: done getting next task for host managed_node1 12613 1727096141.63279: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096141.63282: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.63295: getting variables 12613 1727096141.63297: in VariableManager get_vars() 12613 1727096141.63355: Calling all_inventory to load vars for managed_node1 12613 1727096141.63358: Calling groups_inventory to load vars for managed_node1 12613 1727096141.63360: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.63553: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.63557: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.63563: done sending task result for task 0afff68d-5257-a9dd-d073-000000000035 12613 1727096141.63566: WORKER PROCESS EXITING 12613 1727096141.63572: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.63747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.63972: done with get_vars() 12613 1727096141.63983: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:41 -0400 (0:00:00.069) 0:00:05.277 ****** 12613 1727096141.64063: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096141.64065: Creating lock for fedora.linux_system_roles.network_connections 12613 1727096141.64345: worker is 1 (out of 1 available) 12613 1727096141.64362: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096141.64477: done queuing things up, now waiting for results queue to drain 12613 1727096141.64479: waiting for pending results... 
12613 1727096141.64638: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096141.64771: in run() - task 0afff68d-5257-a9dd-d073-000000000036 12613 1727096141.64791: variable 'ansible_search_path' from source: unknown 12613 1727096141.64798: variable 'ansible_search_path' from source: unknown 12613 1727096141.64838: calling self._execute() 12613 1727096141.64926: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.64939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.64957: variable 'omit' from source: magic vars 12613 1727096141.65461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.67777: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.67848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.67906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.67945: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.67983: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.68067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.68172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.68176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.68178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.68191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.68324: variable 'ansible_distribution' from source: facts 12613 1727096141.68335: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.68361: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.68370: when evaluation is False, skipping this task 12613 1727096141.68377: _execute() done 12613 1727096141.68382: dumping result to json 12613 1727096141.68389: done dumping result, returning 12613 1727096141.68401: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-a9dd-d073-000000000036] 12613 1727096141.68409: sending task result for task 0afff68d-5257-a9dd-d073-000000000036 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096141.68571: no more pending results, returning what we have 12613 1727096141.68574: results queue empty 12613 1727096141.68575: checking for any_errors_fatal 12613 1727096141.68581: done checking for any_errors_fatal 12613 1727096141.68582: checking for max_fail_percentage 12613 1727096141.68584: done checking for max_fail_percentage 12613 1727096141.68585: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.68585: done checking to see if all hosts have failed 12613 1727096141.68586: getting the remaining hosts for this loop 12613 1727096141.68587: done getting the remaining hosts for this loop 12613 1727096141.68591: getting the next task for host managed_node1 12613 1727096141.68597: done getting next task for host managed_node1 12613 1727096141.68601: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096141.68603: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.68615: getting variables 12613 1727096141.68617: in VariableManager get_vars() 12613 1727096141.68676: Calling all_inventory to load vars for managed_node1 12613 1727096141.68678: Calling groups_inventory to load vars for managed_node1 12613 1727096141.68680: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.68691: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.68694: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.68697: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.69021: done sending task result for task 0afff68d-5257-a9dd-d073-000000000036 12613 1727096141.69024: WORKER PROCESS EXITING 12613 1727096141.69048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.69265: done with get_vars() 12613 1727096141.69277: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:41 -0400 (0:00:00.052) 0:00:05.330 ****** 12613 1727096141.69364: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096141.69366: Creating lock for fedora.linux_system_roles.network_state 12613 1727096141.69663: worker is 1 (out of 1 available) 12613 1727096141.69678: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096141.69691: done queuing things up, now waiting for results queue to drain 12613 1727096141.69692: waiting for pending results... 
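'Configure networking connection profiles' and 'Configure networking state' are the role's two workhorse tasks; they dispatch to the collection's own network_connections and network_state action plugins, which is why the log records 'Creating lock for fedora.linux_system_roles.network_connections' and '... network_state' the first time each is queued. A caller normally feeds them through role variables roughly like this (values are illustrative; 'nm-bond' matches the interface asserted later in this run, and the exact profile schema should be checked against the role documentation):

- name: Configure bonding via the network role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_provider: nm
    network_connections:
      - name: nm-bond
        type: bond
        state: up
        ip:
          dhcp4: true

When an Nmstate-style network_state dictionary is supplied instead of (or alongside) network_connections, the second task takes over; on this host both are skipped by the same distribution conditional.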
12613 1727096141.69926: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096141.70055: in run() - task 0afff68d-5257-a9dd-d073-000000000037 12613 1727096141.70078: variable 'ansible_search_path' from source: unknown 12613 1727096141.70086: variable 'ansible_search_path' from source: unknown 12613 1727096141.70126: calling self._execute() 12613 1727096141.70211: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.70222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.70236: variable 'omit' from source: magic vars 12613 1727096141.70658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.73146: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.73153: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.73157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.73159: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.73335: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.73421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.73458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.73490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.73532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.73553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.73694: variable 'ansible_distribution' from source: facts 12613 1727096141.73705: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.73725: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.73732: when evaluation is False, skipping this task 12613 1727096141.73738: _execute() done 12613 1727096141.73744: dumping result to json 12613 1727096141.73753: done dumping result, returning 12613 1727096141.73766: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-a9dd-d073-000000000037] 12613 1727096141.73780: sending task result for task 0afff68d-5257-a9dd-d073-000000000037 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12613 1727096141.73927: no more pending results, returning what we have 12613 1727096141.73930: results queue empty 12613 1727096141.73931: checking for any_errors_fatal 12613 1727096141.73936: done checking for any_errors_fatal 12613 1727096141.73937: checking for max_fail_percentage 12613 1727096141.73938: done checking for max_fail_percentage 12613 1727096141.73939: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.73940: done checking to see if all hosts have failed 12613 1727096141.73941: getting the remaining hosts for this loop 12613 1727096141.73942: done getting the remaining hosts for this loop 12613 1727096141.73945: getting the next task for host managed_node1 12613 1727096141.73952: done getting next task for host managed_node1 12613 1727096141.73956: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096141.73959: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.74044: getting variables 12613 1727096141.74047: in VariableManager get_vars() 12613 1727096141.74198: Calling all_inventory to load vars for managed_node1 12613 1727096141.74201: Calling groups_inventory to load vars for managed_node1 12613 1727096141.74204: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.74214: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.74216: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.74219: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.74573: done sending task result for task 0afff68d-5257-a9dd-d073-000000000037 12613 1727096141.74577: WORKER PROCESS EXITING 12613 1727096141.74590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.75111: done with get_vars() 12613 1727096141.75124: done getting variables 12613 1727096141.75186: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:41 -0400 (0:00:00.058) 0:00:05.389 ****** 12613 1727096141.75220: entering _queue_task() for managed_node1/debug 12613 1727096141.75847: worker is 1 (out of 1 available) 12613 1727096141.75864: exiting _queue_task() for managed_node1/debug 12613 1727096141.75878: done queuing things up, now waiting for results queue to drain 12613 1727096141.75880: waiting for pending results... 
12613 1727096141.76347: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096141.76697: in run() - task 0afff68d-5257-a9dd-d073-000000000038 12613 1727096141.76713: variable 'ansible_search_path' from source: unknown 12613 1727096141.76718: variable 'ansible_search_path' from source: unknown 12613 1727096141.76834: calling self._execute() 12613 1727096141.76955: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.76959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.76969: variable 'omit' from source: magic vars 12613 1727096141.77976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.81077: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.81185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.81250: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.81377: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.81380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.81431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.81486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.81518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.81582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.81617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.81775: variable 'ansible_distribution' from source: facts 12613 1727096141.81788: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.81818: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.81835: when evaluation is False, skipping this task 12613 1727096141.81872: _execute() done 12613 1727096141.81875: dumping result to json 12613 1727096141.81877: done dumping result, returning 12613 1727096141.81880: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-a9dd-d073-000000000038] 12613 1727096141.81882: sending task result for task 0afff68d-5257-a9dd-d073-000000000038 12613 1727096141.82315: done sending task result for task 0afff68d-5257-a9dd-d073-000000000038 12613 1727096141.82319: WORKER 
PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096141.82366: no more pending results, returning what we have 12613 1727096141.82373: results queue empty 12613 1727096141.82374: checking for any_errors_fatal 12613 1727096141.82378: done checking for any_errors_fatal 12613 1727096141.82379: checking for max_fail_percentage 12613 1727096141.82381: done checking for max_fail_percentage 12613 1727096141.82382: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.82383: done checking to see if all hosts have failed 12613 1727096141.82383: getting the remaining hosts for this loop 12613 1727096141.82385: done getting the remaining hosts for this loop 12613 1727096141.82388: getting the next task for host managed_node1 12613 1727096141.82395: done getting next task for host managed_node1 12613 1727096141.82401: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096141.82404: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.82420: getting variables 12613 1727096141.82422: in VariableManager get_vars() 12613 1727096141.82540: Calling all_inventory to load vars for managed_node1 12613 1727096141.82543: Calling groups_inventory to load vars for managed_node1 12613 1727096141.82546: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.82559: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.82562: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.82565: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.82945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.83156: done with get_vars() 12613 1727096141.83170: done getting variables 12613 1727096141.83261: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:41 -0400 (0:00:00.080) 0:00:05.470 ****** 12613 1727096141.83294: entering _queue_task() for managed_node1/debug 12613 1727096141.83619: worker is 1 (out of 1 available) 12613 1727096141.83632: exiting _queue_task() for managed_node1/debug 12613 1727096141.83643: done queuing things up, now waiting for results queue to drain 12613 1727096141.83644: waiting for pending results... 
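The 'Show stderr messages ...' and 'Show debug messages ...' tasks that close out the role are plain debug actions which normally print fields of results registered by the two configure tasks above. A sketch of the pattern (the registered variable name here is hypothetical, not necessarily the one the role uses):

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
  when: __network_connections_result is defined

Here all three of these debug tasks are skipped by the same distribution conditional, so nothing is printed.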
12613 1727096141.83921: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096141.84091: in run() - task 0afff68d-5257-a9dd-d073-000000000039 12613 1727096141.84095: variable 'ansible_search_path' from source: unknown 12613 1727096141.84098: variable 'ansible_search_path' from source: unknown 12613 1727096141.84120: calling self._execute() 12613 1727096141.84207: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.84218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.84472: variable 'omit' from source: magic vars 12613 1727096141.84666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096141.88649: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096141.88855: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096141.88995: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096141.89044: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096141.89158: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096141.89315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096141.89380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096141.89488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096141.89621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096141.89640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096141.89932: variable 'ansible_distribution' from source: facts 12613 1727096141.90007: variable 'ansible_distribution_major_version' from source: facts 12613 1727096141.90044: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096141.90274: when evaluation is False, skipping this task 12613 1727096141.90278: _execute() done 12613 1727096141.90280: dumping result to json 12613 1727096141.90282: done dumping result, returning 12613 1727096141.90284: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-a9dd-d073-000000000039] 12613 1727096141.90286: sending task result for task 0afff68d-5257-a9dd-d073-000000000039 skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 12613 1727096141.90423: no more pending results, returning what we have 12613 1727096141.90427: results queue empty 12613 1727096141.90428: checking for any_errors_fatal 12613 1727096141.90433: done checking for any_errors_fatal 12613 1727096141.90433: checking for max_fail_percentage 12613 1727096141.90435: done checking for max_fail_percentage 12613 1727096141.90436: checking to see if all hosts have failed and the running result is not ok 12613 1727096141.90436: done checking to see if all hosts have failed 12613 1727096141.90437: getting the remaining hosts for this loop 12613 1727096141.90438: done getting the remaining hosts for this loop 12613 1727096141.90442: getting the next task for host managed_node1 12613 1727096141.90449: done getting next task for host managed_node1 12613 1727096141.90455: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096141.90458: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096141.90475: getting variables 12613 1727096141.90478: in VariableManager get_vars() 12613 1727096141.90536: Calling all_inventory to load vars for managed_node1 12613 1727096141.90539: Calling groups_inventory to load vars for managed_node1 12613 1727096141.90541: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096141.90555: Calling all_plugins_play to load vars for managed_node1 12613 1727096141.90558: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096141.90562: Calling groups_plugins_play to load vars for managed_node1 12613 1727096141.91421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096141.91761: done with get_vars() 12613 1727096141.91775: done getting variables 12613 1727096141.92522: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:41 -0400 (0:00:00.092) 0:00:05.562 ****** 12613 1727096141.92563: entering _queue_task() for managed_node1/debug 12613 1727096141.93410: done sending task result for task 0afff68d-5257-a9dd-d073-000000000039 12613 1727096141.93448: WORKER PROCESS EXITING 12613 1727096141.93438: worker is 1 (out of 1 available) 12613 1727096141.93460: exiting _queue_task() for managed_node1/debug 12613 1727096141.93472: done queuing things up, now waiting for results queue to drain 12613 1727096141.93473: waiting for pending results... 
12613 1727096141.93675: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096141.94376: in run() - task 0afff68d-5257-a9dd-d073-00000000003a 12613 1727096141.94400: variable 'ansible_search_path' from source: unknown 12613 1727096141.94408: variable 'ansible_search_path' from source: unknown 12613 1727096141.94454: calling self._execute() 12613 1727096141.94541: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096141.94690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096141.94708: variable 'omit' from source: magic vars 12613 1727096141.95594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.00712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.00871: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.00908: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.01097: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.01125: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.01403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.01429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.01452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.01658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.01661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.01876: variable 'ansible_distribution' from source: facts 12613 1727096142.01882: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.02015: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.02018: when evaluation is False, skipping this task 12613 1727096142.02021: _execute() done 12613 1727096142.02023: dumping result to json 12613 1727096142.02025: done dumping result, returning 12613 1727096142.02034: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-a9dd-d073-00000000003a] 12613 1727096142.02039: sending task result for task 0afff68d-5257-a9dd-d073-00000000003a skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)" } 12613 1727096142.02190: no more pending results, returning what we have 12613 1727096142.02194: results queue empty 12613 1727096142.02195: checking for any_errors_fatal 12613 1727096142.02200: done checking for any_errors_fatal 12613 1727096142.02201: checking for max_fail_percentage 12613 1727096142.02203: done checking for max_fail_percentage 12613 1727096142.02204: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.02205: done checking to see if all hosts have failed 12613 1727096142.02205: getting the remaining hosts for this loop 12613 1727096142.02207: done getting the remaining hosts for this loop 12613 1727096142.02210: getting the next task for host managed_node1 12613 1727096142.02217: done getting next task for host managed_node1 12613 1727096142.02222: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096142.02225: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.02240: getting variables 12613 1727096142.02242: in VariableManager get_vars() 12613 1727096142.02302: Calling all_inventory to load vars for managed_node1 12613 1727096142.02305: Calling groups_inventory to load vars for managed_node1 12613 1727096142.02308: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.02319: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.02322: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.02326: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.02887: done sending task result for task 0afff68d-5257-a9dd-d073-00000000003a 12613 1727096142.02890: WORKER PROCESS EXITING 12613 1727096142.03287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.03711: done with get_vars() 12613 1727096142.03725: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:42 -0400 (0:00:00.112) 0:00:05.675 ****** 12613 1727096142.03823: entering _queue_task() for managed_node1/ping 12613 1727096142.03825: Creating lock for ping 12613 1727096142.04464: worker is 1 (out of 1 available) 12613 1727096142.04482: exiting _queue_task() for managed_node1/ping 12613 1727096142.04495: done queuing things up, now waiting for results queue to drain 12613 1727096142.04496: waiting for pending results... 
12613 1727096142.04941: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096142.05431: in run() - task 0afff68d-5257-a9dd-d073-00000000003b 12613 1727096142.05435: variable 'ansible_search_path' from source: unknown 12613 1727096142.05437: variable 'ansible_search_path' from source: unknown 12613 1727096142.05440: calling self._execute() 12613 1727096142.05476: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.05547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.05564: variable 'omit' from source: magic vars 12613 1727096142.06403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.09193: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.09278: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.09325: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.09363: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.09395: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.09499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.09545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.09585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.09631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.09655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.09808: variable 'ansible_distribution' from source: facts 12613 1727096142.09820: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.09846: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.09855: when evaluation is False, skipping this task 12613 1727096142.09882: _execute() done 12613 1727096142.09885: dumping result to json 12613 1727096142.09888: done dumping result, returning 12613 1727096142.09988: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-a9dd-d073-00000000003b] 12613 1727096142.09991: sending task result for task 0afff68d-5257-a9dd-d073-00000000003b 12613 1727096142.10050: done sending task result for task 0afff68d-5257-a9dd-d073-00000000003b 12613 1727096142.10056: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.10126: no more pending results, returning what we have 12613 1727096142.10129: results queue empty 12613 1727096142.10130: checking for any_errors_fatal 12613 1727096142.10135: done checking for any_errors_fatal 12613 1727096142.10136: checking for max_fail_percentage 12613 1727096142.10138: done checking for max_fail_percentage 12613 1727096142.10139: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.10140: done checking to see if all hosts have failed 12613 1727096142.10140: getting the remaining hosts for this loop 12613 1727096142.10142: done getting the remaining hosts for this loop 12613 1727096142.10145: getting the next task for host managed_node1 12613 1727096142.10156: done getting next task for host managed_node1 12613 1727096142.10158: ^ task is: TASK: meta (role_complete) 12613 1727096142.10161: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.10177: getting variables 12613 1727096142.10178: in VariableManager get_vars() 12613 1727096142.10225: Calling all_inventory to load vars for managed_node1 12613 1727096142.10229: Calling groups_inventory to load vars for managed_node1 12613 1727096142.10231: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.10239: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.10241: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.10244: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.10906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.11640: done with get_vars() 12613 1727096142.11656: done getting variables 12613 1727096142.11737: done queuing things up, now waiting for results queue to drain 12613 1727096142.11739: results queue empty 12613 1727096142.11740: checking for any_errors_fatal 12613 1727096142.11745: done checking for any_errors_fatal 12613 1727096142.11746: checking for max_fail_percentage 12613 1727096142.11747: done checking for max_fail_percentage 12613 1727096142.11748: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.11748: done checking to see if all hosts have failed 12613 1727096142.11749: getting the remaining hosts for this loop 12613 1727096142.11750: done getting the remaining hosts for this loop 12613 1727096142.11754: getting the next task for host managed_node1 12613 1727096142.11760: done getting next task for host managed_node1 12613 1727096142.11762: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12613 1727096142.11764: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.11766: getting variables 12613 1727096142.12070: in VariableManager get_vars() 12613 1727096142.12094: Calling all_inventory to load vars for managed_node1 12613 1727096142.12096: Calling groups_inventory to load vars for managed_node1 12613 1727096142.12098: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.12103: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.12111: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.12114: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.12258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.13097: done with get_vars() 12613 1727096142.13108: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:42 -0400 (0:00:00.093) 0:00:05.769 ****** 12613 1727096142.13186: entering _queue_task() for managed_node1/include_tasks 12613 1727096142.14103: worker is 1 (out of 1 available) 12613 1727096142.14112: exiting _queue_task() for managed_node1/include_tasks 12613 1727096142.14121: done queuing things up, now waiting for results queue to drain 12613 1727096142.14123: waiting for pending results... 
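After meta (role_complete) the play moves on to the test playbook's own checks. 'Include the task get_interface_stat.yml' is an include_tasks step from assert_device_present.yml, and the task after it asserts that the templated interface ('{{ interface }}', resolved through the controller_device play variable to 'nm-bond') is present. A sketch of that pattern, with a hypothetical interface_stat registered variable and assert body (the included file's contents are not shown in this log):

- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks:
    file: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists          # hypothetical variable registered by the included file
    msg: "Interface {{ interface }} is not present on the managed host"

In this run the include itself is skipped by the distribution conditional before the assert is queued.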
12613 1727096142.14386: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 12613 1727096142.14578: in run() - task 0afff68d-5257-a9dd-d073-00000000006e 12613 1727096142.14597: variable 'ansible_search_path' from source: unknown 12613 1727096142.14604: variable 'ansible_search_path' from source: unknown 12613 1727096142.14705: calling self._execute() 12613 1727096142.14943: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.14947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.14949: variable 'omit' from source: magic vars 12613 1727096142.15813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.18657: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.18877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.18881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.18884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.18887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.18894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.18924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.18947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.18991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.19004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.19140: variable 'ansible_distribution' from source: facts 12613 1727096142.19144: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.19219: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.19222: when evaluation is False, skipping this task 12613 1727096142.19224: _execute() done 12613 1727096142.19227: dumping result to json 12613 1727096142.19228: done dumping result, returning 12613 1727096142.19230: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-a9dd-d073-00000000006e] 12613 1727096142.19232: sending task result for task 0afff68d-5257-a9dd-d073-00000000006e 12613 1727096142.19295: done sending task result for task 0afff68d-5257-a9dd-d073-00000000006e 12613 1727096142.19297: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.19364: no more pending results, returning what we have 12613 1727096142.19368: results queue empty 12613 1727096142.19370: checking for any_errors_fatal 12613 1727096142.19371: done checking for any_errors_fatal 12613 1727096142.19371: checking for max_fail_percentage 12613 1727096142.19373: done checking for max_fail_percentage 12613 1727096142.19374: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.19375: done checking to see if all hosts have failed 12613 1727096142.19375: getting the remaining hosts for this loop 12613 1727096142.19376: done getting the remaining hosts for this loop 12613 1727096142.19380: getting the next task for host managed_node1 12613 1727096142.19390: done getting next task for host managed_node1 12613 1727096142.19394: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12613 1727096142.19397: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.19400: getting variables 12613 1727096142.19402: in VariableManager get_vars() 12613 1727096142.19457: Calling all_inventory to load vars for managed_node1 12613 1727096142.19460: Calling groups_inventory to load vars for managed_node1 12613 1727096142.19462: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.19472: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.19474: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.19477: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.19962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.20364: done with get_vars() 12613 1727096142.20376: done getting variables 12613 1727096142.20544: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12613 1727096142.20779: variable 'interface' from source: task vars 12613 1727096142.20783: variable 'controller_device' from source: play vars 12613 1727096142.20842: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:42 -0400 (0:00:00.077) 0:00:05.847 ****** 12613 1727096142.20981: entering _queue_task() for managed_node1/assert 12613 1727096142.21461: worker is 1 (out of 1 available) 12613 1727096142.21676: exiting _queue_task() for managed_node1/assert 12613 1727096142.21687: done queuing things up, now waiting for 
results queue to drain 12613 1727096142.21688: waiting for pending results... 12613 1727096142.22285: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'nm-bond' 12613 1727096142.22289: in run() - task 0afff68d-5257-a9dd-d073-00000000006f 12613 1727096142.22299: variable 'ansible_search_path' from source: unknown 12613 1727096142.22305: variable 'ansible_search_path' from source: unknown 12613 1727096142.22341: calling self._execute() 12613 1727096142.22462: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.22604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.22617: variable 'omit' from source: magic vars 12613 1727096142.23356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.28409: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.28583: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.28842: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.28846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.28848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.29008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.29043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.29196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.29239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.29257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.29507: variable 'ansible_distribution' from source: facts 12613 1727096142.29518: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.29593: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.29708: when evaluation is False, skipping this task 12613 1727096142.29711: _execute() done 12613 1727096142.29714: dumping result to json 12613 1727096142.29716: done dumping result, returning 12613 1727096142.29718: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'nm-bond' [0afff68d-5257-a9dd-d073-00000000006f] 12613 1727096142.29720: sending task result for task 0afff68d-5257-a9dd-d073-00000000006f 12613 1727096142.29786: done sending task result for task 0afff68d-5257-a9dd-d073-00000000006f 12613 1727096142.29788: 
WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.29857: no more pending results, returning what we have 12613 1727096142.29861: results queue empty 12613 1727096142.29862: checking for any_errors_fatal 12613 1727096142.29869: done checking for any_errors_fatal 12613 1727096142.29869: checking for max_fail_percentage 12613 1727096142.29871: done checking for max_fail_percentage 12613 1727096142.29872: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.29873: done checking to see if all hosts have failed 12613 1727096142.29874: getting the remaining hosts for this loop 12613 1727096142.29875: done getting the remaining hosts for this loop 12613 1727096142.29879: getting the next task for host managed_node1 12613 1727096142.29888: done getting next task for host managed_node1 12613 1727096142.29891: ^ task is: TASK: Include the task 'assert_profile_present.yml' 12613 1727096142.29893: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.29897: getting variables 12613 1727096142.29898: in VariableManager get_vars() 12613 1727096142.29953: Calling all_inventory to load vars for managed_node1 12613 1727096142.29956: Calling groups_inventory to load vars for managed_node1 12613 1727096142.29959: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.30274: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.30278: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.30282: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.30458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.31016: done with get_vars() 12613 1727096142.31032: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67 Monday 23 September 2024 08:55:42 -0400 (0:00:00.102) 0:00:05.949 ****** 12613 1727096142.31233: entering _queue_task() for managed_node1/include_tasks 12613 1727096142.31829: worker is 1 (out of 1 available) 12613 1727096142.31842: exiting _queue_task() for managed_node1/include_tasks 12613 1727096142.31855: done queuing things up, now waiting for results queue to drain 12613 1727096142.31856: waiting for pending results... 
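Both tasks skipped just above come from tasks/assert_device_present.yml (lines 3 and 5 in the task paths), and the variable trace shows 'interface' resolving through 'controller_device' to 'nm-bond'. A plausible reconstruction of that file, based only on the task names and variable sources visible here, is sketched below; the register name interface_stat and the exact assert expression are assumptions, since a skipped task never prints its arguments:

# tasks/assert_device_present.yml, reconstructed sketch
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists                       # register name assumed
    fail_msg: "Device {{ interface }} is not present"    # message assumed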
12613 1727096142.32484: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 12613 1727096142.32528: in run() - task 0afff68d-5257-a9dd-d073-000000000070 12613 1727096142.32540: variable 'ansible_search_path' from source: unknown 12613 1727096142.32709: variable 'controller_profile' from source: play vars 12613 1727096142.33222: variable 'controller_profile' from source: play vars 12613 1727096142.33237: variable 'port1_profile' from source: play vars 12613 1727096142.33308: variable 'port1_profile' from source: play vars 12613 1727096142.33315: variable 'port2_profile' from source: play vars 12613 1727096142.33503: variable 'port2_profile' from source: play vars 12613 1727096142.33516: variable 'omit' from source: magic vars 12613 1727096142.33876: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.33879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.33882: variable 'omit' from source: magic vars 12613 1727096142.34466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.36996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.37072: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.37115: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.37160: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.37194: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.37285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.37320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.37349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.37402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.37420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.37549: variable 'ansible_distribution' from source: facts 12613 1727096142.37560: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.37589: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.37597: when evaluation is False, skipping this task 12613 1727096142.37632: variable 'item' from source: unknown 12613 1727096142.37724: variable 'item' from source: unknown skipping: [managed_node1] => (item=bond0) => { "ansible_loop_var": "item", "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "item": "bond0", "skip_reason": "Conditional result was False" } 12613 1727096142.38077: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.38080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.38083: variable 'omit' from source: magic vars 12613 1727096142.38199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.38276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.38280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.38296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.38316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.38411: variable 'ansible_distribution' from source: facts 12613 1727096142.38423: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.38435: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.38441: when evaluation is False, skipping this task 12613 1727096142.38646: variable 'item' from source: unknown 12613 1727096142.38947: variable 'item' from source: unknown skipping: [managed_node1] => (item=bond0.0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "item": "bond0.0", "skip_reason": "Conditional result was False" } 12613 1727096142.39019: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.39022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.39025: variable 'omit' from source: magic vars 12613 1727096142.39576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.39581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.39586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.39604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.39622: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.39975: variable 'ansible_distribution' from source: facts 12613 1727096142.39978: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.39980: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.39982: when evaluation is False, skipping this task 12613 1727096142.39984: variable 'item' from source: unknown 12613 1727096142.40039: variable 'item' from source: unknown skipping: [managed_node1] => (item=bond0.1) => { "ansible_loop_var": "item", "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "item": "bond0.1", "skip_reason": "Conditional result was False" } 12613 1727096142.40274: dumping result to json 12613 1727096142.40277: done dumping result, returning 12613 1727096142.40280: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [0afff68d-5257-a9dd-d073-000000000070] 12613 1727096142.40282: sending task result for task 0afff68d-5257-a9dd-d073-000000000070 12613 1727096142.40325: done sending task result for task 0afff68d-5257-a9dd-d073-000000000070 12613 1727096142.40328: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false } MSG: All items skipped 12613 1727096142.40391: no more pending results, returning what we have 12613 1727096142.40395: results queue empty 12613 1727096142.40396: checking for any_errors_fatal 12613 1727096142.40400: done checking for any_errors_fatal 12613 1727096142.40401: checking for max_fail_percentage 12613 1727096142.40402: done checking for max_fail_percentage 12613 1727096142.40403: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.40403: done checking to see if all hosts have failed 12613 1727096142.40404: getting the remaining hosts for this loop 12613 1727096142.40405: done getting the remaining hosts for this loop 12613 1727096142.40409: getting the next task for host managed_node1 12613 1727096142.40415: done getting next task for host managed_node1 12613 1727096142.40418: ^ task is: TASK: ** TEST check polling interval 12613 1727096142.40420: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096142.40423: getting variables 12613 1727096142.40425: in VariableManager get_vars() 12613 1727096142.40491: Calling all_inventory to load vars for managed_node1 12613 1727096142.40494: Calling groups_inventory to load vars for managed_node1 12613 1727096142.40496: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.40507: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.40510: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.40513: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.40928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.41734: done with get_vars() 12613 1727096142.41747: done getting variables 12613 1727096142.41814: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Monday 23 September 2024 08:55:42 -0400 (0:00:00.106) 0:00:06.055 ****** 12613 1727096142.41845: entering _queue_task() for managed_node1/command 12613 1727096142.42547: worker is 1 (out of 1 available) 12613 1727096142.42560: exiting _queue_task() for managed_node1/command 12613 1727096142.42676: done queuing things up, now waiting for results queue to drain 12613 1727096142.42678: waiting for pending results... 12613 1727096142.43282: running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval 12613 1727096142.43286: in run() - task 0afff68d-5257-a9dd-d073-000000000071 12613 1727096142.43289: variable 'ansible_search_path' from source: unknown 12613 1727096142.43401: calling self._execute() 12613 1727096142.43492: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.43674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.43678: variable 'omit' from source: magic vars 12613 1727096142.44500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.47665: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.47876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.47880: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.47883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.47885: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.47887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.47910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 12613 1727096142.47939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.47983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.47999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.48131: variable 'ansible_distribution' from source: facts 12613 1727096142.48137: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.48274: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.48277: when evaluation is False, skipping this task 12613 1727096142.48279: _execute() done 12613 1727096142.48281: dumping result to json 12613 1727096142.48282: done dumping result, returning 12613 1727096142.48284: done running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval [0afff68d-5257-a9dd-d073-000000000071] 12613 1727096142.48286: sending task result for task 0afff68d-5257-a9dd-d073-000000000071 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.48474: no more pending results, returning what we have 12613 1727096142.48477: results queue empty 12613 1727096142.48478: checking for any_errors_fatal 12613 1727096142.48485: done checking for any_errors_fatal 12613 1727096142.48485: checking for max_fail_percentage 12613 1727096142.48487: done checking for max_fail_percentage 12613 1727096142.48488: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.48488: done checking to see if all hosts have failed 12613 1727096142.48489: getting the remaining hosts for this loop 12613 1727096142.48490: done getting the remaining hosts for this loop 12613 1727096142.48493: getting the next task for host managed_node1 12613 1727096142.48498: done getting next task for host managed_node1 12613 1727096142.48500: ^ task is: TASK: ** TEST check IPv4 12613 1727096142.48502: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096142.48505: getting variables 12613 1727096142.48506: in VariableManager get_vars() 12613 1727096142.48552: Calling all_inventory to load vars for managed_node1 12613 1727096142.48555: Calling groups_inventory to load vars for managed_node1 12613 1727096142.48557: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.48566: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.48572: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.48575: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.48883: done sending task result for task 0afff68d-5257-a9dd-d073-000000000071 12613 1727096142.48894: WORKER PROCESS EXITING 12613 1727096142.48947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.49170: done with get_vars() 12613 1727096142.49182: done getting variables 12613 1727096142.49245: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Monday 23 September 2024 08:55:42 -0400 (0:00:00.074) 0:00:06.129 ****** 12613 1727096142.49276: entering _queue_task() for managed_node1/command 12613 1727096142.49561: worker is 1 (out of 1 available) 12613 1727096142.49677: exiting _queue_task() for managed_node1/command 12613 1727096142.49688: done queuing things up, now waiting for results queue to drain 12613 1727096142.49690: waiting for pending results... 
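The include at tests_bond_removal.yml:67 was skipped once per item, and the log resolves the three loop items to bond0, bond0.0 and bond0.1 through controller_profile, port1_profile and port2_profile, using the default loop variable item. A sketch consistent with that output follows; how the profile name is handed to assert_profile_present.yml (the vars: line) is an assumption:

- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  loop:
    - "{{ controller_profile }}"   # resolves to bond0 in this run
    - "{{ port1_profile }}"        # resolves to bond0.0
    - "{{ port2_profile }}"        # resolves to bond0.1
  vars:
    profile: "{{ item }}"          # assumed way of passing the profile name into the included file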
12613 1727096142.49873: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 12613 1727096142.49967: in run() - task 0afff68d-5257-a9dd-d073-000000000072 12613 1727096142.50173: variable 'ansible_search_path' from source: unknown 12613 1727096142.50177: calling self._execute() 12613 1727096142.50182: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.50188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.50191: variable 'omit' from source: magic vars 12613 1727096142.50917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.54352: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.54437: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.54478: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.54517: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.54548: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.54637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.54670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.54702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.54749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.54767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.54894: variable 'ansible_distribution' from source: facts 12613 1727096142.54905: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.54930: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.54942: when evaluation is False, skipping this task 12613 1727096142.54948: _execute() done 12613 1727096142.55058: dumping result to json 12613 1727096142.55061: done dumping result, returning 12613 1727096142.55063: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 [0afff68d-5257-a9dd-d073-000000000072] 12613 1727096142.55065: sending task result for task 0afff68d-5257-a9dd-d073-000000000072 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.55422: no more pending results, returning what we have 12613 1727096142.55425: results queue empty 12613 
1727096142.55427: checking for any_errors_fatal 12613 1727096142.55435: done checking for any_errors_fatal 12613 1727096142.55436: checking for max_fail_percentage 12613 1727096142.55438: done checking for max_fail_percentage 12613 1727096142.55438: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.55439: done checking to see if all hosts have failed 12613 1727096142.55440: getting the remaining hosts for this loop 12613 1727096142.55441: done getting the remaining hosts for this loop 12613 1727096142.55445: getting the next task for host managed_node1 12613 1727096142.55451: done getting next task for host managed_node1 12613 1727096142.55454: ^ task is: TASK: ** TEST check IPv6 12613 1727096142.55456: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.55459: getting variables 12613 1727096142.55461: in VariableManager get_vars() 12613 1727096142.56413: done sending task result for task 0afff68d-5257-a9dd-d073-000000000072 12613 1727096142.56417: WORKER PROCESS EXITING 12613 1727096142.56418: Calling all_inventory to load vars for managed_node1 12613 1727096142.56421: Calling groups_inventory to load vars for managed_node1 12613 1727096142.56424: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.56433: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.56435: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.56438: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.56681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.56919: done with get_vars() 12613 1727096142.56930: done getting variables 12613 1727096142.57221: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Monday 23 September 2024 08:55:42 -0400 (0:00:00.079) 0:00:06.209 ****** 12613 1727096142.57247: entering _queue_task() for managed_node1/command 12613 1727096142.57785: worker is 1 (out of 1 available) 12613 1727096142.57798: exiting _queue_task() for managed_node1/command 12613 1727096142.57809: done queuing things up, now waiting for results queue to drain 12613 1727096142.57810: waiting for pending results... 
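The three '** TEST check ...' tasks at tests_bond_removal.yml:75, :80 and :87 each load the command action plugin and are then skipped, so their actual command lines never reach this log. The sketch below fixes only the shape such a task needs in order to produce this output; the command itself is a placeholder, not the real test:

- name: "** TEST check IPv4"
  command: ip -4 addr show {{ controller_device }}   # placeholder; the real command is not recorded for a skipped task
  register: ipv4_result                              # register name assumed
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9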
12613 1727096142.58384: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 12613 1727096142.58390: in run() - task 0afff68d-5257-a9dd-d073-000000000073 12613 1727096142.58460: variable 'ansible_search_path' from source: unknown 12613 1727096142.58520: calling self._execute() 12613 1727096142.58620: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.58634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.58649: variable 'omit' from source: magic vars 12613 1727096142.59097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.61681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.61748: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.61791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.61827: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.61855: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.61939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.61969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.62001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.62041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.62057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.62218: variable 'ansible_distribution' from source: facts 12613 1727096142.62229: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.62263: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.62267: when evaluation is False, skipping this task 12613 1727096142.62305: _execute() done 12613 1727096142.62309: dumping result to json 12613 1727096142.62320: done dumping result, returning 12613 1727096142.62329: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 [0afff68d-5257-a9dd-d073-000000000073] 12613 1727096142.62332: sending task result for task 0afff68d-5257-a9dd-d073-000000000073 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.62482: no more pending results, returning what we have 12613 1727096142.62486: results queue empty 12613 
1727096142.62487: checking for any_errors_fatal 12613 1727096142.62493: done checking for any_errors_fatal 12613 1727096142.62494: checking for max_fail_percentage 12613 1727096142.62496: done checking for max_fail_percentage 12613 1727096142.62497: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.62498: done checking to see if all hosts have failed 12613 1727096142.62499: getting the remaining hosts for this loop 12613 1727096142.62501: done getting the remaining hosts for this loop 12613 1727096142.62504: getting the next task for host managed_node1 12613 1727096142.62574: done getting next task for host managed_node1 12613 1727096142.62581: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096142.62584: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.62601: done sending task result for task 0afff68d-5257-a9dd-d073-000000000073 12613 1727096142.62616: getting variables 12613 1727096142.62824: in VariableManager get_vars() 12613 1727096142.62873: Calling all_inventory to load vars for managed_node1 12613 1727096142.62876: Calling groups_inventory to load vars for managed_node1 12613 1727096142.62878: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.62887: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.62889: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.62891: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.63225: WORKER PROCESS EXITING 12613 1727096142.63240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.63487: done with get_vars() 12613 1727096142.63500: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:42 -0400 (0:00:00.063) 0:00:06.273 ****** 12613 1727096142.63597: entering _queue_task() for managed_node1/include_tasks 12613 1727096142.63889: worker is 1 (out of 1 available) 12613 1727096142.63902: exiting _queue_task() for managed_node1/include_tasks 12613 1727096142.63914: done queuing things up, now waiting for results queue to drain 12613 1727096142.63916: waiting for pending results... 
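From this point the task paths move out of the test playbook and into roles/network/tasks/main.yml, meaning the fedora.linux_system_roles.network role itself is now being stepped through (and skipped). The play presumably pulls the role in roughly as below; whether it uses roles:, include_role or import_role is not visible from this log, so the invocation style is an assumption:

- hosts: all
  roles:
    - fedora.linux_system_roles.network   # invocation style assumed; only the role's task paths appear in the log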
12613 1727096142.64384: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096142.64389: in run() - task 0afff68d-5257-a9dd-d073-00000000007b 12613 1727096142.64392: variable 'ansible_search_path' from source: unknown 12613 1727096142.64395: variable 'ansible_search_path' from source: unknown 12613 1727096142.64397: calling self._execute() 12613 1727096142.64473: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.64484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.64497: variable 'omit' from source: magic vars 12613 1727096142.64926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.67180: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.67275: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.67318: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.67363: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.67395: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.67491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.67526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.67563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.67610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.67631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.67781: variable 'ansible_distribution' from source: facts 12613 1727096142.67883: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.67887: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.67889: when evaluation is False, skipping this task 12613 1727096142.67892: _execute() done 12613 1727096142.67895: dumping result to json 12613 1727096142.67897: done dumping result, returning 12613 1727096142.67899: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-a9dd-d073-00000000007b] 12613 1727096142.67902: sending task result for task 0afff68d-5257-a9dd-d073-00000000007b 12613 1727096142.67977: done sending task result for task 0afff68d-5257-a9dd-d073-00000000007b 12613 1727096142.67980: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.68036: no more pending results, returning what we have 12613 1727096142.68040: results queue empty 12613 1727096142.68041: checking for any_errors_fatal 12613 1727096142.68048: done checking for any_errors_fatal 12613 1727096142.68049: checking for max_fail_percentage 12613 1727096142.68051: done checking for max_fail_percentage 12613 1727096142.68051: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.68052: done checking to see if all hosts have failed 12613 1727096142.68053: getting the remaining hosts for this loop 12613 1727096142.68054: done getting the remaining hosts for this loop 12613 1727096142.68058: getting the next task for host managed_node1 12613 1727096142.68067: done getting next task for host managed_node1 12613 1727096142.68073: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096142.68076: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.68093: getting variables 12613 1727096142.68095: in VariableManager get_vars() 12613 1727096142.68153: Calling all_inventory to load vars for managed_node1 12613 1727096142.68156: Calling groups_inventory to load vars for managed_node1 12613 1727096142.68158: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.68273: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.68277: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.68281: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.68465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.68879: done with get_vars() 12613 1727096142.68892: done getting variables 12613 1727096142.68951: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:42 -0400 (0:00:00.053) 0:00:06.327 ****** 12613 1727096142.68988: entering _queue_task() for managed_node1/debug 12613 1727096142.69307: worker is 1 (out of 1 available) 12613 1727096142.69320: exiting _queue_task() for managed_node1/debug 12613 1727096142.69331: done queuing things up, now waiting for results queue to drain 12613 1727096142.69332: waiting for pending results... 
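The task at roles/network/tasks/main.yml:7 loads the debug action plugin before being skipped, so it is a plain debug statement about the selected provider. A sketch is below; the variable name network_provider and the message wording are assumptions taken from the task title, not from the log output:

- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"   # variable name and wording assumed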
12613 1727096142.69695: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096142.69700: in run() - task 0afff68d-5257-a9dd-d073-00000000007c 12613 1727096142.69737: variable 'ansible_search_path' from source: unknown 12613 1727096142.69744: variable 'ansible_search_path' from source: unknown 12613 1727096142.69784: calling self._execute() 12613 1727096142.69866: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.69880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.69894: variable 'omit' from source: magic vars 12613 1727096142.70356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.73239: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.73399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.73450: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.73497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.73533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.73618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.73655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.73686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.73733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.73776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.73944: variable 'ansible_distribution' from source: facts 12613 1727096142.73959: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.73985: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.73993: when evaluation is False, skipping this task 12613 1727096142.73999: _execute() done 12613 1727096142.74021: dumping result to json 12613 1727096142.74023: done dumping result, returning 12613 1727096142.74025: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-a9dd-d073-00000000007c] 12613 1727096142.74172: sending task result for task 0afff68d-5257-a9dd-d073-00000000007c 12613 1727096142.74280: done sending task result for task 0afff68d-5257-a9dd-d073-00000000007c 12613 1727096142.74282: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096142.74325: no more pending results, returning what we have 12613 1727096142.74329: results queue empty 12613 1727096142.74330: checking for any_errors_fatal 12613 1727096142.74336: done checking for any_errors_fatal 12613 1727096142.74336: checking for max_fail_percentage 12613 1727096142.74339: done checking for max_fail_percentage 12613 1727096142.74340: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.74341: done checking to see if all hosts have failed 12613 1727096142.74342: getting the remaining hosts for this loop 12613 1727096142.74343: done getting the remaining hosts for this loop 12613 1727096142.74347: getting the next task for host managed_node1 12613 1727096142.74358: done getting next task for host managed_node1 12613 1727096142.74363: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096142.74365: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.74492: getting variables 12613 1727096142.74494: in VariableManager get_vars() 12613 1727096142.74556: Calling all_inventory to load vars for managed_node1 12613 1727096142.74559: Calling groups_inventory to load vars for managed_node1 12613 1727096142.74561: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.74732: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.74736: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.74740: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.74966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.75196: done with get_vars() 12613 1727096142.75207: done getting variables 12613 1727096142.75264: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:42 -0400 (0:00:00.063) 0:00:06.390 ****** 12613 1727096142.75299: entering _queue_task() for managed_node1/fail 12613 1727096142.75589: worker is 1 (out of 1 available) 12613 1727096142.75600: exiting _queue_task() for managed_node1/fail 12613 1727096142.75612: done queuing things up, now waiting for results queue to drain 12613 1727096142.75614: waiting for pending results... 
12613 1727096142.75985: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096142.75991: in run() - task 0afff68d-5257-a9dd-d073-00000000007d 12613 1727096142.75994: variable 'ansible_search_path' from source: unknown 12613 1727096142.75996: variable 'ansible_search_path' from source: unknown 12613 1727096142.76017: calling self._execute() 12613 1727096142.76100: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.76111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.76124: variable 'omit' from source: magic vars 12613 1727096142.76538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.78712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.78801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.78843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.78891: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.78922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.79013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.79049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.79086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.79131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.79151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.79302: variable 'ansible_distribution' from source: facts 12613 1727096142.79313: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.79337: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.79348: when evaluation is False, skipping this task 12613 1727096142.79358: _execute() done 12613 1727096142.79366: dumping result to json 12613 1727096142.79429: done dumping result, returning 12613 1727096142.79434: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-a9dd-d073-00000000007d] 12613 1727096142.79437: sending task result for task 
0afff68d-5257-a9dd-d073-00000000007d skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.79717: no more pending results, returning what we have 12613 1727096142.79720: results queue empty 12613 1727096142.79721: checking for any_errors_fatal 12613 1727096142.79727: done checking for any_errors_fatal 12613 1727096142.79728: checking for max_fail_percentage 12613 1727096142.79730: done checking for max_fail_percentage 12613 1727096142.79731: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.79732: done checking to see if all hosts have failed 12613 1727096142.79733: getting the remaining hosts for this loop 12613 1727096142.79734: done getting the remaining hosts for this loop 12613 1727096142.79738: getting the next task for host managed_node1 12613 1727096142.79745: done getting next task for host managed_node1 12613 1727096142.79749: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096142.79755: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096142.79774: getting variables 12613 1727096142.79777: in VariableManager get_vars() 12613 1727096142.79834: Calling all_inventory to load vars for managed_node1 12613 1727096142.79837: Calling groups_inventory to load vars for managed_node1 12613 1727096142.79840: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.79854: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.79858: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.79861: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.80078: done sending task result for task 0afff68d-5257-a9dd-d073-00000000007d 12613 1727096142.80081: WORKER PROCESS EXITING 12613 1727096142.80256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.80467: done with get_vars() 12613 1727096142.80480: done getting variables 12613 1727096142.80539: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:42 -0400 (0:00:00.052) 0:00:06.442 ****** 12613 1727096142.80579: entering _queue_task() for managed_node1/fail 12613 1727096142.80856: worker is 1 (out of 1 available) 12613 1727096142.80973: exiting _queue_task() for managed_node1/fail 12613 1727096142.80984: done queuing things up, now waiting for results queue to drain 12613 1727096142.80985: waiting for pending results... 
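The condition being evaluated here is built from two gathered facts, ansible_distribution and ansible_distribution_major_version, both reported as coming from "source: facts". A throwaway playbook like the following shows what they evaluate to on managed_node1 (the play itself is an assumption for illustration; only the fact names come from the log):

- hosts: managed_node1
  gather_facts: true
  tasks:
    - name: Show the facts the skip condition depends on
      ansible.builtin.debug:
        msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"

On this run the pair evidently does not match CentOS/RedHat with a major version below 9, so every task guarded by that expression is skipped.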
12613 1727096142.81177: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096142.81355: in run() - task 0afff68d-5257-a9dd-d073-00000000007e 12613 1727096142.81385: variable 'ansible_search_path' from source: unknown 12613 1727096142.81395: variable 'ansible_search_path' from source: unknown 12613 1727096142.81441: calling self._execute() 12613 1727096142.81553: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.81565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.81585: variable 'omit' from source: magic vars 12613 1727096142.82036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.84301: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.84377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.84422: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.84463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.84494: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.84580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.84618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.84649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.84692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.84708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.84859: variable 'ansible_distribution' from source: facts 12613 1727096142.84946: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.84950: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.84955: when evaluation is False, skipping this task 12613 1727096142.84957: _execute() done 12613 1727096142.84959: dumping result to json 12613 1727096142.84961: done dumping result, returning 12613 1727096142.84964: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-a9dd-d073-00000000007e] 12613 1727096142.84966: sending task result for task 0afff68d-5257-a9dd-d073-00000000007e 12613 1727096142.85037: 
done sending task result for task 0afff68d-5257-a9dd-d073-00000000007e 12613 1727096142.85039: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.85099: no more pending results, returning what we have 12613 1727096142.85103: results queue empty 12613 1727096142.85104: checking for any_errors_fatal 12613 1727096142.85112: done checking for any_errors_fatal 12613 1727096142.85113: checking for max_fail_percentage 12613 1727096142.85115: done checking for max_fail_percentage 12613 1727096142.85116: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.85116: done checking to see if all hosts have failed 12613 1727096142.85117: getting the remaining hosts for this loop 12613 1727096142.85118: done getting the remaining hosts for this loop 12613 1727096142.85122: getting the next task for host managed_node1 12613 1727096142.85129: done getting next task for host managed_node1 12613 1727096142.85133: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096142.85136: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096142.85156: getting variables 12613 1727096142.85158: in VariableManager get_vars() 12613 1727096142.85215: Calling all_inventory to load vars for managed_node1 12613 1727096142.85218: Calling groups_inventory to load vars for managed_node1 12613 1727096142.85221: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.85231: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.85234: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.85238: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.85779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.85987: done with get_vars() 12613 1727096142.85999: done getting variables 12613 1727096142.86057: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:42 -0400 (0:00:00.055) 0:00:06.498 ****** 12613 1727096142.86093: entering _queue_task() for managed_node1/fail 12613 1727096142.86353: worker is 1 (out of 1 available) 12613 1727096142.86366: exiting _queue_task() for managed_node1/fail 12613 1727096142.86480: done queuing things up, now waiting for results queue to drain 12613 1727096142.86482: waiting for pending results... 
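Each task in this section is skipped with an identical false_condition, which is the usual signature of a single when: attached at a higher level (a block, an include, or the play applying the role) and inherited by the individual tasks. The following sketch only illustrates that inheritance mechanism; it is not the actual layout of the role's tasks/main.yml:

- name: Handling that only applies to CentOS/RHEL 8 and older   # hypothetical block name
  when: ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9
  block:
    - name: First guarded task
      ansible.builtin.debug:
        msg: only runs when the inherited condition is true
    - name: Second guarded task
      ansible.builtin.debug:
        msg: reports the same expression as its false_condition when skipped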
12613 1727096142.86686: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096142.86816: in run() - task 0afff68d-5257-a9dd-d073-00000000007f 12613 1727096142.86819: variable 'ansible_search_path' from source: unknown 12613 1727096142.86822: variable 'ansible_search_path' from source: unknown 12613 1727096142.86973: calling self._execute() 12613 1727096142.86977: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.86979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.86981: variable 'omit' from source: magic vars 12613 1727096142.87410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.89695: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.89773: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.89817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.89856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.89887: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.89978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.90011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.90045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.90136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.90139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.90250: variable 'ansible_distribution' from source: facts 12613 1727096142.90264: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.90288: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.90295: when evaluation is False, skipping this task 12613 1727096142.90301: _execute() done 12613 1727096142.90306: dumping result to json 12613 1727096142.90312: done dumping result, returning 12613 1727096142.90322: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-a9dd-d073-00000000007f] 12613 1727096142.90355: sending task result for task 0afff68d-5257-a9dd-d073-00000000007f skipping: [managed_node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.90616: no more pending results, returning what we have 12613 1727096142.90620: results queue empty 12613 1727096142.90621: checking for any_errors_fatal 12613 1727096142.90626: done checking for any_errors_fatal 12613 1727096142.90628: checking for max_fail_percentage 12613 1727096142.90630: done checking for max_fail_percentage 12613 1727096142.90630: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.90631: done checking to see if all hosts have failed 12613 1727096142.90632: getting the remaining hosts for this loop 12613 1727096142.90634: done getting the remaining hosts for this loop 12613 1727096142.90638: getting the next task for host managed_node1 12613 1727096142.90645: done getting next task for host managed_node1 12613 1727096142.90649: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096142.90654: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096142.90674: done sending task result for task 0afff68d-5257-a9dd-d073-00000000007f 12613 1727096142.90677: WORKER PROCESS EXITING 12613 1727096142.90686: getting variables 12613 1727096142.90688: in VariableManager get_vars() 12613 1727096142.90742: Calling all_inventory to load vars for managed_node1 12613 1727096142.90745: Calling groups_inventory to load vars for managed_node1 12613 1727096142.90748: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.90761: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.90764: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.90767: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.91058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.91348: done with get_vars() 12613 1727096142.91362: done getting variables 12613 1727096142.91421: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:42 -0400 (0:00:00.053) 0:00:06.551 ****** 12613 1727096142.91457: entering _queue_task() for managed_node1/dnf 12613 1727096142.91726: worker is 1 (out of 1 available) 12613 1727096142.91737: 
exiting _queue_task() for managed_node1/dnf 12613 1727096142.91749: done queuing things up, now waiting for results queue to drain 12613 1727096142.91750: waiting for pending results... 12613 1727096142.92017: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096142.92146: in run() - task 0afff68d-5257-a9dd-d073-000000000080 12613 1727096142.92170: variable 'ansible_search_path' from source: unknown 12613 1727096142.92179: variable 'ansible_search_path' from source: unknown 12613 1727096142.92219: calling self._execute() 12613 1727096142.92575: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096142.92578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096142.92580: variable 'omit' from source: magic vars 12613 1727096142.93306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096142.96705: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096142.96787: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096142.96832: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096142.96875: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096142.96910: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096142.97003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096142.97040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096142.97076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096142.97125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096142.97145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096142.97295: variable 'ansible_distribution' from source: facts 12613 1727096142.97306: variable 'ansible_distribution_major_version' from source: facts 12613 1727096142.97333: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096142.97341: when evaluation is False, skipping this task 12613 1727096142.97348: _execute() done 12613 1727096142.97358: dumping result to json 12613 1727096142.97366: done dumping result, returning 12613 1727096142.97380: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000080] 12613 1727096142.97426: sending task result for task 0afff68d-5257-a9dd-d073-000000000080 12613 1727096142.97507: done sending task result for task 0afff68d-5257-a9dd-d073-000000000080 12613 1727096142.97510: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096142.97588: no more pending results, returning what we have 12613 1727096142.97592: results queue empty 12613 1727096142.97593: checking for any_errors_fatal 12613 1727096142.97601: done checking for any_errors_fatal 12613 1727096142.97601: checking for max_fail_percentage 12613 1727096142.97603: done checking for max_fail_percentage 12613 1727096142.97604: checking to see if all hosts have failed and the running result is not ok 12613 1727096142.97605: done checking to see if all hosts have failed 12613 1727096142.97606: getting the remaining hosts for this loop 12613 1727096142.97607: done getting the remaining hosts for this loop 12613 1727096142.97611: getting the next task for host managed_node1 12613 1727096142.97619: done getting next task for host managed_node1 12613 1727096142.97623: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096142.97626: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096142.97645: getting variables 12613 1727096142.97647: in VariableManager get_vars() 12613 1727096142.97710: Calling all_inventory to load vars for managed_node1 12613 1727096142.97714: Calling groups_inventory to load vars for managed_node1 12613 1727096142.97716: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096142.97727: Calling all_plugins_play to load vars for managed_node1 12613 1727096142.97730: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096142.97733: Calling groups_plugins_play to load vars for managed_node1 12613 1727096142.98246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096142.98996: done with get_vars() 12613 1727096142.99009: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096142.99087: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:42 -0400 (0:00:00.076) 0:00:06.628 ****** 12613 1727096142.99117: entering _queue_task() for managed_node1/yum 12613 1727096142.99929: worker is 1 (out of 1 available) 12613 1727096142.99941: exiting _queue_task() for managed_node1/yum 12613 1727096142.99956: done queuing things up, now waiting for results queue to drain 12613 1727096142.99958: waiting for pending results... 
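The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above means a task written against the yum module is executed by the dnf action plugin on this ansible-core 2.17 controller. A hypothetical task that would go through exactly that redirect, using check mode as one common way to probe for available updates (the package name is a placeholder):

- name: Check whether a newer NetworkManager is available   # illustrative only
  ansible.builtin.yum:
    name: NetworkManager
    state: latest
  check_mode: true

In check mode such a task reports changed when an update exists, without installing anything.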
12613 1727096143.00587: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096143.00592: in run() - task 0afff68d-5257-a9dd-d073-000000000081 12613 1727096143.00595: variable 'ansible_search_path' from source: unknown 12613 1727096143.00774: variable 'ansible_search_path' from source: unknown 12613 1727096143.00818: calling self._execute() 12613 1727096143.00907: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.00916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.00927: variable 'omit' from source: magic vars 12613 1727096143.01976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.06664: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.06960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.06999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.07036: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.07063: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.07255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.07388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.07415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.07457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.07674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.07806: variable 'ansible_distribution' from source: facts 12613 1727096143.07812: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.07902: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.07906: when evaluation is False, skipping this task 12613 1727096143.07908: _execute() done 12613 1727096143.07910: dumping result to json 12613 1727096143.07912: done dumping result, returning 12613 1727096143.07914: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000081] 12613 1727096143.07917: sending task result for task 
0afff68d-5257-a9dd-d073-000000000081 12613 1727096143.08273: done sending task result for task 0afff68d-5257-a9dd-d073-000000000081 12613 1727096143.08276: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.08324: no more pending results, returning what we have 12613 1727096143.08327: results queue empty 12613 1727096143.08328: checking for any_errors_fatal 12613 1727096143.08334: done checking for any_errors_fatal 12613 1727096143.08334: checking for max_fail_percentage 12613 1727096143.08336: done checking for max_fail_percentage 12613 1727096143.08337: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.08337: done checking to see if all hosts have failed 12613 1727096143.08338: getting the remaining hosts for this loop 12613 1727096143.08339: done getting the remaining hosts for this loop 12613 1727096143.08343: getting the next task for host managed_node1 12613 1727096143.08349: done getting next task for host managed_node1 12613 1727096143.08356: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096143.08359: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096143.08378: getting variables 12613 1727096143.08380: in VariableManager get_vars() 12613 1727096143.08433: Calling all_inventory to load vars for managed_node1 12613 1727096143.08436: Calling groups_inventory to load vars for managed_node1 12613 1727096143.08438: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.08448: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.08453: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.08457: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.08835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.09257: done with get_vars() 12613 1727096143.09472: done getting variables 12613 1727096143.09528: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:43 -0400 (0:00:00.104) 0:00:06.732 ****** 12613 1727096143.09564: entering _queue_task() for managed_node1/fail 12613 1727096143.10051: worker is 1 (out of 1 available) 12613 1727096143.10269: exiting _queue_task() for managed_node1/fail 12613 1727096143.10282: done queuing things up, now waiting for results queue to drain 12613 1727096143.10284: waiting for pending results... 
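Before every task in this run the executor also resolves the omit placeholder ("variable 'omit' from source: magic vars"). That magic variable exists so a module argument can be dropped entirely when no value is supplied; a small, self-contained illustration unrelated to the network role (the paths and the file_mode variable are hypothetical):

- name: Copy a config file, leaving mode unset unless file_mode is provided
  ansible.builtin.copy:
    src: files/example.conf            # hypothetical source path
    dest: /etc/example.conf            # hypothetical destination
    mode: "{{ file_mode | default(omit) }}"

If file_mode is undefined, the mode argument is left out of the module invocation instead of being passed as an empty or literal placeholder value.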
12613 1727096143.10643: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096143.10839: in run() - task 0afff68d-5257-a9dd-d073-000000000082 12613 1727096143.10854: variable 'ansible_search_path' from source: unknown 12613 1727096143.10858: variable 'ansible_search_path' from source: unknown 12613 1727096143.11056: calling self._execute() 12613 1727096143.11059: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.11062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.11065: variable 'omit' from source: magic vars 12613 1727096143.12077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.15982: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.16064: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.16117: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.16158: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.16191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.16320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.16324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.16350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.16407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.16586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.16991: variable 'ansible_distribution' from source: facts 12613 1727096143.16994: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.16996: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.16999: when evaluation is False, skipping this task 12613 1727096143.17001: _execute() done 12613 1727096143.17004: dumping result to json 12613 1727096143.17006: done dumping result, returning 12613 1727096143.17008: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000082] 12613 1727096143.17011: sending task result for task 0afff68d-5257-a9dd-d073-000000000082 skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.17141: no more pending results, returning what we have 12613 1727096143.17145: results queue empty 12613 1727096143.17146: checking for any_errors_fatal 12613 1727096143.17157: done checking for any_errors_fatal 12613 1727096143.17158: checking for max_fail_percentage 12613 1727096143.17160: done checking for max_fail_percentage 12613 1727096143.17160: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.17161: done checking to see if all hosts have failed 12613 1727096143.17162: getting the remaining hosts for this loop 12613 1727096143.17163: done getting the remaining hosts for this loop 12613 1727096143.17169: getting the next task for host managed_node1 12613 1727096143.17177: done getting next task for host managed_node1 12613 1727096143.17181: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12613 1727096143.17184: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.17202: getting variables 12613 1727096143.17204: in VariableManager get_vars() 12613 1727096143.17265: Calling all_inventory to load vars for managed_node1 12613 1727096143.17382: Calling groups_inventory to load vars for managed_node1 12613 1727096143.17386: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.17396: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.17399: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.17403: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.17786: done sending task result for task 0afff68d-5257-a9dd-d073-000000000082 12613 1727096143.17789: WORKER PROCESS EXITING 12613 1727096143.17817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.18027: done with get_vars() 12613 1727096143.18042: done getting variables 12613 1727096143.18104: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:43 -0400 (0:00:00.085) 0:00:06.818 ****** 12613 1727096143.18136: entering _queue_task() for managed_node1/package 12613 1727096143.18584: worker is 1 (out of 1 available) 12613 1727096143.18594: exiting _queue_task() for managed_node1/package 12613 1727096143.18603: done queuing things up, now waiting for results queue to drain 12613 1727096143.18604: waiting for pending results... 
12613 1727096143.18778: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 12613 1727096143.18920: in run() - task 0afff68d-5257-a9dd-d073-000000000083 12613 1727096143.18941: variable 'ansible_search_path' from source: unknown 12613 1727096143.18954: variable 'ansible_search_path' from source: unknown 12613 1727096143.18995: calling self._execute() 12613 1727096143.19095: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.19108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.19126: variable 'omit' from source: magic vars 12613 1727096143.19581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.22046: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.22135: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.22186: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.22225: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.22276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.22344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.22573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.22577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.22580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.22583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.22617: variable 'ansible_distribution' from source: facts 12613 1727096143.22629: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.22663: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.22674: when evaluation is False, skipping this task 12613 1727096143.22680: _execute() done 12613 1727096143.22687: dumping result to json 12613 1727096143.22700: done dumping result, returning 12613 1727096143.22710: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-a9dd-d073-000000000083] 12613 1727096143.22719: sending task result for task 0afff68d-5257-a9dd-d073-000000000083 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 12613 1727096143.23011: no more pending results, returning what we have 12613 1727096143.23015: results queue empty 12613 1727096143.23016: checking for any_errors_fatal 12613 1727096143.23026: done checking for any_errors_fatal 12613 1727096143.23028: checking for max_fail_percentage 12613 1727096143.23030: done checking for max_fail_percentage 12613 1727096143.23031: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.23032: done checking to see if all hosts have failed 12613 1727096143.23033: getting the remaining hosts for this loop 12613 1727096143.23034: done getting the remaining hosts for this loop 12613 1727096143.23037: getting the next task for host managed_node1 12613 1727096143.23044: done getting next task for host managed_node1 12613 1727096143.23048: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096143.23050: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.23072: getting variables 12613 1727096143.23073: in VariableManager get_vars() 12613 1727096143.23243: Calling all_inventory to load vars for managed_node1 12613 1727096143.23246: Calling groups_inventory to load vars for managed_node1 12613 1727096143.23249: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.23258: done sending task result for task 0afff68d-5257-a9dd-d073-000000000083 12613 1727096143.23261: WORKER PROCESS EXITING 12613 1727096143.23271: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.23274: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.23277: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.23604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.23825: done with get_vars() 12613 1727096143.23837: done getting variables 12613 1727096143.23901: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:43 -0400 (0:00:00.057) 0:00:06.876 ****** 12613 1727096143.23932: entering _queue_task() for managed_node1/package 12613 1727096143.24330: worker is 1 (out of 1 available) 12613 1727096143.24340: exiting _queue_task() for managed_node1/package 12613 1727096143.24348: done queuing things up, now waiting for results queue to drain 12613 1727096143.24349: waiting for pending results... 
12613 1727096143.24719: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096143.25019: in run() - task 0afff68d-5257-a9dd-d073-000000000084 12613 1727096143.25195: variable 'ansible_search_path' from source: unknown 12613 1727096143.25198: variable 'ansible_search_path' from source: unknown 12613 1727096143.25201: calling self._execute() 12613 1727096143.25397: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.25414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.25436: variable 'omit' from source: magic vars 12613 1727096143.26170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.29363: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.29481: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.29586: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.29623: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.29661: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.29773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.29791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.29821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.30280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.30284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.30287: variable 'ansible_distribution' from source: facts 12613 1727096143.30289: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.30291: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.30294: when evaluation is False, skipping this task 12613 1727096143.30296: _execute() done 12613 1727096143.30298: dumping result to json 12613 1727096143.30300: done dumping result, returning 12613 1727096143.30302: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-a9dd-d073-000000000084] 12613 1727096143.30304: sending task result for task 0afff68d-5257-a9dd-d073-000000000084 skipping: [managed_node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.30537: no more pending results, returning what we have 12613 1727096143.30540: results queue empty 12613 1727096143.30541: checking for any_errors_fatal 12613 1727096143.30548: done checking for any_errors_fatal 12613 1727096143.30549: checking for max_fail_percentage 12613 1727096143.30553: done checking for max_fail_percentage 12613 1727096143.30554: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.30555: done checking to see if all hosts have failed 12613 1727096143.30556: getting the remaining hosts for this loop 12613 1727096143.30557: done getting the remaining hosts for this loop 12613 1727096143.30561: getting the next task for host managed_node1 12613 1727096143.30570: done getting next task for host managed_node1 12613 1727096143.30575: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096143.30577: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.30595: getting variables 12613 1727096143.30597: in VariableManager get_vars() 12613 1727096143.30658: Calling all_inventory to load vars for managed_node1 12613 1727096143.30661: Calling groups_inventory to load vars for managed_node1 12613 1727096143.30664: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.30878: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.30882: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.30888: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.31336: done sending task result for task 0afff68d-5257-a9dd-d073-000000000084 12613 1727096143.31339: WORKER PROCESS EXITING 12613 1727096143.31359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.31750: done with get_vars() 12613 1727096143.31775: done getting variables 12613 1727096143.31838: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:43 -0400 (0:00:00.079) 0:00:06.956 ****** 12613 1727096143.31884: entering _queue_task() for managed_node1/package 12613 1727096143.32301: worker is 1 (out of 1 available) 12613 1727096143.32312: exiting _queue_task() for managed_node1/package 12613 1727096143.32322: done queuing things up, now waiting for results queue to drain 12613 
1727096143.32323: waiting for pending results... 12613 1727096143.32539: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096143.32681: in run() - task 0afff68d-5257-a9dd-d073-000000000085 12613 1727096143.32705: variable 'ansible_search_path' from source: unknown 12613 1727096143.32714: variable 'ansible_search_path' from source: unknown 12613 1727096143.32771: calling self._execute() 12613 1727096143.32880: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.32892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.32912: variable 'omit' from source: magic vars 12613 1727096143.33406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.35943: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.36022: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.36072: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.36116: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.36151: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.36232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.36274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.36303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.36345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.36372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.36507: variable 'ansible_distribution' from source: facts 12613 1727096143.36519: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.36539: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.36546: when evaluation is False, skipping this task 12613 1727096143.36553: _execute() done 12613 1727096143.36560: dumping result to json 12613 1727096143.36572: done dumping result, returning 12613 1727096143.36606: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-a9dd-d073-000000000085] 12613 1727096143.36617: sending task result for task 0afff68d-5257-a9dd-d073-000000000085 12613 1727096143.37022: done sending task result for 
task 0afff68d-5257-a9dd-d073-000000000085 12613 1727096143.37025: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.37072: no more pending results, returning what we have 12613 1727096143.37075: results queue empty 12613 1727096143.37076: checking for any_errors_fatal 12613 1727096143.37082: done checking for any_errors_fatal 12613 1727096143.37083: checking for max_fail_percentage 12613 1727096143.37084: done checking for max_fail_percentage 12613 1727096143.37085: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.37086: done checking to see if all hosts have failed 12613 1727096143.37086: getting the remaining hosts for this loop 12613 1727096143.37087: done getting the remaining hosts for this loop 12613 1727096143.37091: getting the next task for host managed_node1 12613 1727096143.37097: done getting next task for host managed_node1 12613 1727096143.37101: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096143.37103: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096143.37119: getting variables 12613 1727096143.37120: in VariableManager get_vars() 12613 1727096143.37174: Calling all_inventory to load vars for managed_node1 12613 1727096143.37176: Calling groups_inventory to load vars for managed_node1 12613 1727096143.37178: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.37187: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.37190: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.37193: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.37419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.37645: done with get_vars() 12613 1727096143.37659: done getting variables 12613 1727096143.37723: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:43 -0400 (0:00:00.058) 0:00:07.014 ****** 12613 1727096143.37758: entering _queue_task() for managed_node1/service 12613 1727096143.38044: worker is 1 (out of 1 available) 12613 1727096143.38057: exiting _queue_task() for managed_node1/service 12613 1727096143.38070: done queuing things up, now waiting for results queue to drain 12613 1727096143.38072: waiting for pending results... 
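
Both package-install steps above were skipped because their shared distribution check evaluated to False. For orientation, a guarded task of roughly the following shape would produce exactly this kind of skip result; this is an illustrative sketch (the 'package' module and the conditional are taken from the log, everything else is assumed), not the role's actual source:

    - name: Install python3-libnmstate when using network_state variable
      package:                                  # matches the 'package' action module loaded above
        name: python3-libnmstate                # package name inferred from the task title (assumption)
        state: present
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9
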
12613 1727096143.38348: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096143.38474: in run() - task 0afff68d-5257-a9dd-d073-000000000086 12613 1727096143.38494: variable 'ansible_search_path' from source: unknown 12613 1727096143.38673: variable 'ansible_search_path' from source: unknown 12613 1727096143.38677: calling self._execute() 12613 1727096143.38679: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.38681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.38684: variable 'omit' from source: magic vars 12613 1727096143.39050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.41208: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.41283: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.41330: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.41372: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.41429: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.41539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.41577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.41631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.41679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.41699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.41850: variable 'ansible_distribution' from source: facts 12613 1727096143.41862: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.41887: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.41895: when evaluation is False, skipping this task 12613 1727096143.41902: _execute() done 12613 1727096143.41909: dumping result to json 12613 1727096143.41940: done dumping result, returning 12613 1727096143.41943: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000086] 12613 1727096143.41946: sending task result for task 0afff68d-5257-a9dd-d073-000000000086 12613 1727096143.42275: done sending task result for task 0afff68d-5257-a9dd-d073-000000000086 12613 
1727096143.42278: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.42319: no more pending results, returning what we have 12613 1727096143.42322: results queue empty 12613 1727096143.42323: checking for any_errors_fatal 12613 1727096143.42329: done checking for any_errors_fatal 12613 1727096143.42329: checking for max_fail_percentage 12613 1727096143.42331: done checking for max_fail_percentage 12613 1727096143.42332: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.42332: done checking to see if all hosts have failed 12613 1727096143.42333: getting the remaining hosts for this loop 12613 1727096143.42334: done getting the remaining hosts for this loop 12613 1727096143.42337: getting the next task for host managed_node1 12613 1727096143.42343: done getting next task for host managed_node1 12613 1727096143.42346: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096143.42349: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.42364: getting variables 12613 1727096143.42366: in VariableManager get_vars() 12613 1727096143.42421: Calling all_inventory to load vars for managed_node1 12613 1727096143.42423: Calling groups_inventory to load vars for managed_node1 12613 1727096143.42425: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.42434: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.42437: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.42439: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.42809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.43016: done with get_vars() 12613 1727096143.43026: done getting variables 12613 1727096143.43087: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:43 -0400 (0:00:00.053) 0:00:07.068 ****** 12613 1727096143.43118: entering _queue_task() for managed_node1/service 12613 1727096143.43391: worker is 1 (out of 1 available) 12613 1727096143.43403: exiting _queue_task() for managed_node1/service 12613 1727096143.43414: done queuing things up, now waiting for results queue to drain 12613 1727096143.43415: waiting for pending results... 
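
The "Restart NetworkManager due to wireless or team interfaces" step is a service action behind the same guard; a rough, hedged sketch of such a task (shape assumed, not copied from the role):

    - name: Restart NetworkManager due to wireless or team interfaces
      service:                                  # 'service' action module, as loaded in the log above
        name: NetworkManager
        state: restarted
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9
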
12613 1727096143.43687: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096143.43813: in run() - task 0afff68d-5257-a9dd-d073-000000000087 12613 1727096143.43834: variable 'ansible_search_path' from source: unknown 12613 1727096143.43842: variable 'ansible_search_path' from source: unknown 12613 1727096143.43888: calling self._execute() 12613 1727096143.43974: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.43989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.44008: variable 'omit' from source: magic vars 12613 1727096143.44437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.46645: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.46740: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.46817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.46872: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.46884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.47084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.47087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.47090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.47094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.47115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.47256: variable 'ansible_distribution' from source: facts 12613 1727096143.47270: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.47296: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.47307: when evaluation is False, skipping this task 12613 1727096143.47320: _execute() done 12613 1727096143.47333: dumping result to json 12613 1727096143.47342: done dumping result, returning 12613 1727096143.47355: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-a9dd-d073-000000000087] 12613 1727096143.47365: sending task result for task 0afff68d-5257-a9dd-d073-000000000087 12613 1727096143.47673: done sending task result for task 0afff68d-5257-a9dd-d073-000000000087 12613 1727096143.47676: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 1727096143.47716: no more pending results, returning what we have 12613 1727096143.47719: results queue empty 12613 1727096143.47720: checking for any_errors_fatal 12613 1727096143.47727: done checking for any_errors_fatal 12613 1727096143.47728: checking for max_fail_percentage 12613 1727096143.47730: done checking for max_fail_percentage 12613 1727096143.47731: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.47731: done checking to see if all hosts have failed 12613 1727096143.47732: getting the remaining hosts for this loop 12613 1727096143.47733: done getting the remaining hosts for this loop 12613 1727096143.47737: getting the next task for host managed_node1 12613 1727096143.47744: done getting next task for host managed_node1 12613 1727096143.47748: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096143.47751: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.47770: getting variables 12613 1727096143.47771: in VariableManager get_vars() 12613 1727096143.47826: Calling all_inventory to load vars for managed_node1 12613 1727096143.47829: Calling groups_inventory to load vars for managed_node1 12613 1727096143.47832: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.47842: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.47845: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.47847: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.48106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.48302: done with get_vars() 12613 1727096143.48315: done getting variables 12613 1727096143.48373: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:43 -0400 (0:00:00.052) 0:00:07.121 ****** 12613 1727096143.48406: entering _queue_task() for managed_node1/service 12613 1727096143.48699: worker is 1 (out of 1 available) 12613 1727096143.48710: exiting _queue_task() for managed_node1/service 12613 1727096143.48721: done queuing things up, now waiting for results queue to drain 12613 1727096143.48722: waiting for pending results... 
12613 1727096143.48972: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096143.49105: in run() - task 0afff68d-5257-a9dd-d073-000000000088 12613 1727096143.49123: variable 'ansible_search_path' from source: unknown 12613 1727096143.49130: variable 'ansible_search_path' from source: unknown 12613 1727096143.49173: calling self._execute() 12613 1727096143.49373: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.49376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.49379: variable 'omit' from source: magic vars 12613 1727096143.49748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.52727: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.52834: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.52969: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.52973: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.52975: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.53101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.53138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.53171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.53220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.53240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.53392: variable 'ansible_distribution' from source: facts 12613 1727096143.53410: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.53433: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.53442: when evaluation is False, skipping this task 12613 1727096143.53449: _execute() done 12613 1727096143.53456: dumping result to json 12613 1727096143.53464: done dumping result, returning 12613 1727096143.53485: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-a9dd-d073-000000000088] 12613 1727096143.53509: sending task result for task 0afff68d-5257-a9dd-d073-000000000088 12613 1727096143.53692: done sending task result for task 0afff68d-5257-a9dd-d073-000000000088 12613 1727096143.53696: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.53771: no more pending results, returning what we have 12613 1727096143.53776: results queue empty 12613 1727096143.53777: checking for any_errors_fatal 12613 1727096143.53784: done checking for any_errors_fatal 12613 1727096143.53784: checking for max_fail_percentage 12613 1727096143.53786: done checking for max_fail_percentage 12613 1727096143.53787: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.53788: done checking to see if all hosts have failed 12613 1727096143.53788: getting the remaining hosts for this loop 12613 1727096143.53790: done getting the remaining hosts for this loop 12613 1727096143.53794: getting the next task for host managed_node1 12613 1727096143.53801: done getting next task for host managed_node1 12613 1727096143.53805: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096143.53808: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.53827: getting variables 12613 1727096143.53830: in VariableManager get_vars() 12613 1727096143.53891: Calling all_inventory to load vars for managed_node1 12613 1727096143.53894: Calling groups_inventory to load vars for managed_node1 12613 1727096143.53897: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.53907: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.53910: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.53913: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.54249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.54588: done with get_vars() 12613 1727096143.54600: done getting variables 12613 1727096143.54704: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:43 -0400 (0:00:00.063) 0:00:07.184 ****** 12613 1727096143.54735: entering _queue_task() for managed_node1/service 12613 1727096143.55440: worker is 1 (out of 1 available) 12613 1727096143.55452: exiting _queue_task() for managed_node1/service 12613 1727096143.55462: done queuing things up, now waiting for results queue to drain 12613 1727096143.55463: waiting for pending results... 
12613 1727096143.56003: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096143.56049: in run() - task 0afff68d-5257-a9dd-d073-000000000089 12613 1727096143.56069: variable 'ansible_search_path' from source: unknown 12613 1727096143.56078: variable 'ansible_search_path' from source: unknown 12613 1727096143.56174: calling self._execute() 12613 1727096143.56210: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.56221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.56237: variable 'omit' from source: magic vars 12613 1727096143.56682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.59059: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.59153: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.59196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.59233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.59473: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.59476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.59479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.59481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.59483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.59485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.59601: variable 'ansible_distribution' from source: facts 12613 1727096143.59613: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.59634: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.59641: when evaluation is False, skipping this task 12613 1727096143.59647: _execute() done 12613 1727096143.59656: dumping result to json 12613 1727096143.59663: done dumping result, returning 12613 1727096143.59676: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-a9dd-d073-000000000089] 12613 1727096143.59685: sending task result for task 0afff68d-5257-a9dd-d073-000000000089 12613 1727096143.59792: done sending task result for task 0afff68d-5257-a9dd-d073-000000000089 12613 1727096143.59800: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 1727096143.59855: no more pending results, returning what we have 12613 1727096143.59858: results queue empty 12613 1727096143.59860: checking for any_errors_fatal 12613 1727096143.59870: done checking for any_errors_fatal 12613 1727096143.59871: checking for max_fail_percentage 12613 1727096143.59873: done checking for max_fail_percentage 12613 1727096143.59874: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.59875: done checking to see if all hosts have failed 12613 1727096143.59876: getting the remaining hosts for this loop 12613 1727096143.59877: done getting the remaining hosts for this loop 12613 1727096143.59881: getting the next task for host managed_node1 12613 1727096143.59888: done getting next task for host managed_node1 12613 1727096143.59892: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096143.59895: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.59913: getting variables 12613 1727096143.59914: in VariableManager get_vars() 12613 1727096143.60045: Calling all_inventory to load vars for managed_node1 12613 1727096143.60048: Calling groups_inventory to load vars for managed_node1 12613 1727096143.60051: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.60062: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.60065: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.60070: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.60535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.60739: done with get_vars() 12613 1727096143.60752: done getting variables 12613 1727096143.60816: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:43 -0400 (0:00:00.061) 0:00:07.245 ****** 12613 1727096143.60849: entering _queue_task() for managed_node1/copy 12613 1727096143.61333: worker is 1 (out of 1 available) 12613 1727096143.61344: exiting _queue_task() for managed_node1/copy 12613 1727096143.61354: done queuing things up, now waiting for results queue to drain 12613 1727096143.61355: waiting for pending results... 
12613 1727096143.61588: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096143.61724: in run() - task 0afff68d-5257-a9dd-d073-00000000008a 12613 1727096143.61873: variable 'ansible_search_path' from source: unknown 12613 1727096143.61876: variable 'ansible_search_path' from source: unknown 12613 1727096143.61879: calling self._execute() 12613 1727096143.61882: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.61884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.61901: variable 'omit' from source: magic vars 12613 1727096143.62332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.65680: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.65751: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.65857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.65964: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.65997: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.66127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.66298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.66329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.66692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.66696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.66698: variable 'ansible_distribution' from source: facts 12613 1727096143.66700: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.66702: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.66704: when evaluation is False, skipping this task 12613 1727096143.66706: _execute() done 12613 1727096143.66708: dumping result to json 12613 1727096143.66710: done dumping result, returning 12613 1727096143.66712: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-a9dd-d073-00000000008a] 12613 1727096143.66714: sending task result for task 0afff68d-5257-a9dd-d073-00000000008a skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.67052: no more pending results, returning what we have 12613 1727096143.67056: results queue empty 12613 1727096143.67057: checking for any_errors_fatal 12613 1727096143.67065: done checking for any_errors_fatal 12613 1727096143.67066: checking for max_fail_percentage 12613 1727096143.67070: done checking for max_fail_percentage 12613 1727096143.67071: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.67072: done checking to see if all hosts have failed 12613 1727096143.67073: getting the remaining hosts for this loop 12613 1727096143.67074: done getting the remaining hosts for this loop 12613 1727096143.67079: getting the next task for host managed_node1 12613 1727096143.67086: done getting next task for host managed_node1 12613 1727096143.67090: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096143.67093: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.67110: getting variables 12613 1727096143.67112: in VariableManager get_vars() 12613 1727096143.67331: Calling all_inventory to load vars for managed_node1 12613 1727096143.67335: Calling groups_inventory to load vars for managed_node1 12613 1727096143.67338: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.67349: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.67353: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.67356: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.68098: done sending task result for task 0afff68d-5257-a9dd-d073-00000000008a 12613 1727096143.68102: WORKER PROCESS EXITING 12613 1727096143.68126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.68914: done with get_vars() 12613 1727096143.68928: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:43 -0400 (0:00:00.081) 0:00:07.327 ****** 12613 1727096143.69015: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096143.69502: worker is 1 (out of 1 available) 12613 1727096143.69511: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096143.69520: done queuing things up, now waiting for results queue to drain 12613 1727096143.69522: waiting for pending results... 
12613 1727096143.69646: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096143.69777: in run() - task 0afff68d-5257-a9dd-d073-00000000008b 12613 1727096143.69799: variable 'ansible_search_path' from source: unknown 12613 1727096143.69808: variable 'ansible_search_path' from source: unknown 12613 1727096143.69848: calling self._execute() 12613 1727096143.69938: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.69948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.69963: variable 'omit' from source: magic vars 12613 1727096143.70398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.72620: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.72988: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.73037: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.73078: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.73108: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.73194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.73243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.73276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.73327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.73350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.73494: variable 'ansible_distribution' from source: facts 12613 1727096143.73543: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.73547: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.73549: when evaluation is False, skipping this task 12613 1727096143.73553: _execute() done 12613 1727096143.73555: dumping result to json 12613 1727096143.73558: done dumping result, returning 12613 1727096143.73672: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-a9dd-d073-00000000008b] 12613 1727096143.73676: sending task result for task 0afff68d-5257-a9dd-d073-00000000008b 12613 1727096143.73747: done sending task result for task 0afff68d-5257-a9dd-d073-00000000008b 12613 1727096143.73750: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.73804: no more pending results, returning what we have 12613 1727096143.73807: results queue empty 12613 1727096143.73809: checking for any_errors_fatal 12613 1727096143.73813: done checking for any_errors_fatal 12613 1727096143.73814: checking for max_fail_percentage 12613 1727096143.73816: done checking for max_fail_percentage 12613 1727096143.73817: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.73818: done checking to see if all hosts have failed 12613 1727096143.73819: getting the remaining hosts for this loop 12613 1727096143.73821: done getting the remaining hosts for this loop 12613 1727096143.73825: getting the next task for host managed_node1 12613 1727096143.73833: done getting next task for host managed_node1 12613 1727096143.73837: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096143.73840: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.73856: getting variables 12613 1727096143.73858: in VariableManager get_vars() 12613 1727096143.73918: Calling all_inventory to load vars for managed_node1 12613 1727096143.73921: Calling groups_inventory to load vars for managed_node1 12613 1727096143.73924: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.73934: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.73936: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.73939: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.74409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.74615: done with get_vars() 12613 1727096143.74628: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:43 -0400 (0:00:00.056) 0:00:07.384 ****** 12613 1727096143.74714: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096143.75172: worker is 1 (out of 1 available) 12613 1727096143.75182: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096143.75192: done queuing things up, now waiting for results queue to drain 12613 1727096143.75193: waiting for pending results... 
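
The skipped "Configure networking connection profiles" step is the role's own network_connections action. A hedged sketch of how a playbook typically feeds it; the variable contents below are illustrative, not taken from this run:

    - hosts: managed_node1
      roles:
        - role: fedora.linux_system_roles.network
      vars:
        network_connections:                    # consumed by the network_connections action seen above
          - name: eth0                          # hypothetical profile/interface name
            type: ethernet
            state: up
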
12613 1727096143.75385: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096143.75474: in run() - task 0afff68d-5257-a9dd-d073-00000000008c 12613 1727096143.75541: variable 'ansible_search_path' from source: unknown 12613 1727096143.75544: variable 'ansible_search_path' from source: unknown 12613 1727096143.75547: calling self._execute() 12613 1727096143.75628: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.75640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.75671: variable 'omit' from source: magic vars 12613 1727096143.76117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.80733: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.80807: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.80846: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.80892: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.80923: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.81009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.81099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.81103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.81118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.81138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.81272: variable 'ansible_distribution' from source: facts 12613 1727096143.81284: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.81305: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.81317: when evaluation is False, skipping this task 12613 1727096143.81323: _execute() done 12613 1727096143.81330: dumping result to json 12613 1727096143.81337: done dumping result, returning 12613 1727096143.81346: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-a9dd-d073-00000000008c] 12613 1727096143.81354: sending task result for task 0afff68d-5257-a9dd-d073-00000000008c 12613 1727096143.81496: done sending task result for task 0afff68d-5257-a9dd-d073-00000000008c 12613 1727096143.81499: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096143.81580: no more pending results, returning what we have 12613 1727096143.81584: results queue empty 12613 1727096143.81585: checking for any_errors_fatal 12613 1727096143.81592: done checking for any_errors_fatal 12613 1727096143.81593: checking for max_fail_percentage 12613 1727096143.81595: done checking for max_fail_percentage 12613 1727096143.81596: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.81596: done checking to see if all hosts have failed 12613 1727096143.81597: getting the remaining hosts for this loop 12613 1727096143.81599: done getting the remaining hosts for this loop 12613 1727096143.81604: getting the next task for host managed_node1 12613 1727096143.81610: done getting next task for host managed_node1 12613 1727096143.81615: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096143.81617: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.81634: getting variables 12613 1727096143.81636: in VariableManager get_vars() 12613 1727096143.81693: Calling all_inventory to load vars for managed_node1 12613 1727096143.81696: Calling groups_inventory to load vars for managed_node1 12613 1727096143.81698: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.81709: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.81713: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.81716: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.82061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.82469: done with get_vars() 12613 1727096143.82480: done getting variables 12613 1727096143.82537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:43 -0400 (0:00:00.078) 0:00:07.462 ****** 12613 1727096143.82571: entering _queue_task() for managed_node1/debug 12613 1727096143.82856: worker is 1 (out of 1 available) 12613 1727096143.83073: exiting _queue_task() for managed_node1/debug 12613 1727096143.83083: done queuing things up, now waiting for results queue to drain 12613 1727096143.83084: waiting for pending results... 
12613 1727096143.83210: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096143.83328: in run() - task 0afff68d-5257-a9dd-d073-00000000008d 12613 1727096143.83332: variable 'ansible_search_path' from source: unknown 12613 1727096143.83335: variable 'ansible_search_path' from source: unknown 12613 1727096143.83417: calling self._execute() 12613 1727096143.83481: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.83492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.83508: variable 'omit' from source: magic vars 12613 1727096143.84029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.86876: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.86880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.87000: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.87073: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.87174: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.87265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.87359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.87463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.87775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.87779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.87782: variable 'ansible_distribution' from source: facts 12613 1727096143.87784: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.87864: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.87882: when evaluation is False, skipping this task 12613 1727096143.87915: _execute() done 12613 1727096143.87938: dumping result to json 12613 1727096143.87946: done dumping result, returning 12613 1727096143.87958: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-a9dd-d073-00000000008d] 12613 1727096143.87971: sending task result for task 0afff68d-5257-a9dd-d073-00000000008d skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 12613 1727096143.88142: no more pending results, returning what we have 12613 1727096143.88146: results queue empty 12613 1727096143.88147: checking for any_errors_fatal 12613 1727096143.88155: done checking for any_errors_fatal 12613 1727096143.88156: checking for max_fail_percentage 12613 1727096143.88157: done checking for max_fail_percentage 12613 1727096143.88158: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.88159: done checking to see if all hosts have failed 12613 1727096143.88160: getting the remaining hosts for this loop 12613 1727096143.88162: done getting the remaining hosts for this loop 12613 1727096143.88166: getting the next task for host managed_node1 12613 1727096143.88176: done getting next task for host managed_node1 12613 1727096143.88181: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096143.88185: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.88202: getting variables 12613 1727096143.88204: in VariableManager get_vars() 12613 1727096143.88265: Calling all_inventory to load vars for managed_node1 12613 1727096143.88572: Calling groups_inventory to load vars for managed_node1 12613 1727096143.88575: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.88584: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.88586: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.88589: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.88748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.89045: done with get_vars() 12613 1727096143.89058: done getting variables 12613 1727096143.89096: done sending task result for task 0afff68d-5257-a9dd-d073-00000000008d 12613 1727096143.89099: WORKER PROCESS EXITING 12613 1727096143.89130: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:43 -0400 (0:00:00.065) 0:00:07.528 ****** 12613 1727096143.89162: entering _queue_task() for managed_node1/debug 12613 1727096143.89436: worker is 1 (out of 1 available) 12613 1727096143.89449: exiting _queue_task() for managed_node1/debug 12613 1727096143.89461: done queuing things up, now waiting for results queue to drain 12613 1727096143.89463: waiting for pending results... 
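Every task in this stretch of the run is skipped for the same reason: the expression quoted in each "Evaluated conditional" entry, (ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9), evaluates to False against the gathered facts, so the managed node is neither CentOS nor RHEL older than 9. As a rough sketch only (the role file at roles/network/tasks/main.yml is not reproduced in this log, and the message variable below is hypothetical), a debug task gated this way would look something like:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        # 'network_connections_stderr' is a hypothetical variable name used for
        # illustration; the real variable lives in the role and is not shown here.
        msg: "{{ network_connections_stderr | default([]) }}"
      when: >-
        ansible_distribution in ['CentOS', 'RedHat']
        and ansible_distribution_major_version | int < 9

On this host the expression is False, which is exactly why each result reports skip_reason "Conditional result was False" without the module ever running.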
12613 1727096143.89851: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096143.89963: in run() - task 0afff68d-5257-a9dd-d073-00000000008e 12613 1727096143.90092: variable 'ansible_search_path' from source: unknown 12613 1727096143.90096: variable 'ansible_search_path' from source: unknown 12613 1727096143.90134: calling self._execute() 12613 1727096143.90333: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.90340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.90351: variable 'omit' from source: magic vars 12613 1727096143.91337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096143.94699: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096143.94703: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096143.94748: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096143.94843: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096143.94871: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096143.95064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096143.95091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096143.95112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096143.95287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096143.95291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096143.95525: variable 'ansible_distribution' from source: facts 12613 1727096143.95531: variable 'ansible_distribution_major_version' from source: facts 12613 1727096143.95549: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096143.95552: when evaluation is False, skipping this task 12613 1727096143.95557: _execute() done 12613 1727096143.95560: dumping result to json 12613 1727096143.95568: done dumping result, returning 12613 1727096143.95678: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-a9dd-d073-00000000008e] 12613 1727096143.95683: sending task result for task 0afff68d-5257-a9dd-d073-00000000008e skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)" } 12613 1727096143.95923: no more pending results, returning what we have 12613 1727096143.95926: results queue empty 12613 1727096143.95927: checking for any_errors_fatal 12613 1727096143.95933: done checking for any_errors_fatal 12613 1727096143.95934: checking for max_fail_percentage 12613 1727096143.95935: done checking for max_fail_percentage 12613 1727096143.95936: checking to see if all hosts have failed and the running result is not ok 12613 1727096143.95937: done checking to see if all hosts have failed 12613 1727096143.95938: getting the remaining hosts for this loop 12613 1727096143.95939: done getting the remaining hosts for this loop 12613 1727096143.95943: getting the next task for host managed_node1 12613 1727096143.95949: done getting next task for host managed_node1 12613 1727096143.95953: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096143.95956: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096143.95973: getting variables 12613 1727096143.95975: in VariableManager get_vars() 12613 1727096143.96028: Calling all_inventory to load vars for managed_node1 12613 1727096143.96031: Calling groups_inventory to load vars for managed_node1 12613 1727096143.96033: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096143.96043: Calling all_plugins_play to load vars for managed_node1 12613 1727096143.96045: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096143.96047: Calling groups_plugins_play to load vars for managed_node1 12613 1727096143.96559: done sending task result for task 0afff68d-5257-a9dd-d073-00000000008e 12613 1727096143.96562: WORKER PROCESS EXITING 12613 1727096143.96597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096143.96841: done with get_vars() 12613 1727096143.96853: done getting variables 12613 1727096143.96916: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:43 -0400 (0:00:00.077) 0:00:07.606 ****** 12613 1727096143.96947: entering _queue_task() for managed_node1/debug 12613 1727096143.97250: worker is 1 (out of 1 available) 12613 1727096143.97263: exiting _queue_task() for managed_node1/debug 12613 1727096143.97278: done queuing things up, now waiting for results queue to drain 12613 1727096143.97279: waiting for pending results... 
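The skipping: [managed_node1] => { ... } payloads above show that a skipped task still returns a result object carrying false_condition and skip_reason. A later task can register such a result and test it with the skipped test; the following is a generic, self-contained sketch (the task names and the deliberately false condition are illustrative, not taken from the test playbook):

    - name: Task that will be skipped (stands in for the gated debug tasks above)
      ansible.builtin.debug:
        msg: "only relevant on CentOS/RHEL older than 9"
      when: false          # placeholder for the distribution conditional
      register: gated_debug

    - name: React to the skip
      ansible.builtin.debug:
        msg: "the gated task was skipped: {{ gated_debug.skip_reason | default('') }}"
      when: gated_debug is skipped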
12613 1727096143.97530: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096143.97659: in run() - task 0afff68d-5257-a9dd-d073-00000000008f 12613 1727096143.97680: variable 'ansible_search_path' from source: unknown 12613 1727096143.97687: variable 'ansible_search_path' from source: unknown 12613 1727096143.97724: calling self._execute() 12613 1727096143.97809: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096143.97819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096143.97834: variable 'omit' from source: magic vars 12613 1727096143.98258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.00945: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.01057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.01130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.01179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.01211: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.01317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.01374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.01474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.01478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.01481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.01616: variable 'ansible_distribution' from source: facts 12613 1727096144.02091: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.02094: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.02097: when evaluation is False, skipping this task 12613 1727096144.02099: _execute() done 12613 1727096144.02101: dumping result to json 12613 1727096144.02104: done dumping result, returning 12613 1727096144.02106: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-a9dd-d073-00000000008f] 12613 1727096144.02108: sending task result for task 0afff68d-5257-a9dd-d073-00000000008f 12613 1727096144.02191: done sending task result for task 0afff68d-5257-a9dd-d073-00000000008f 12613 1727096144.02195: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096144.02248: no more pending results, returning what we have 12613 1727096144.02251: results queue empty 12613 1727096144.02252: checking for any_errors_fatal 12613 1727096144.02259: done checking for any_errors_fatal 12613 1727096144.02260: checking for max_fail_percentage 12613 1727096144.02261: done checking for max_fail_percentage 12613 1727096144.02262: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.02263: done checking to see if all hosts have failed 12613 1727096144.02264: getting the remaining hosts for this loop 12613 1727096144.02265: done getting the remaining hosts for this loop 12613 1727096144.02270: getting the next task for host managed_node1 12613 1727096144.02276: done getting next task for host managed_node1 12613 1727096144.02280: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096144.02282: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.02299: getting variables 12613 1727096144.02301: in VariableManager get_vars() 12613 1727096144.02472: Calling all_inventory to load vars for managed_node1 12613 1727096144.02475: Calling groups_inventory to load vars for managed_node1 12613 1727096144.02478: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.02772: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.02777: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.02781: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.03245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.03522: done with get_vars() 12613 1727096144.03534: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:44 -0400 (0:00:00.066) 0:00:07.673 ****** 12613 1727096144.03636: entering _queue_task() for managed_node1/ping 12613 1727096144.03929: worker is 1 (out of 1 available) 12613 1727096144.03942: exiting _queue_task() for managed_node1/ping 12613 1727096144.03955: done queuing things up, now waiting for results queue to drain 12613 1727096144.03957: waiting for pending results... 
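The entry "entering _queue_task() for managed_node1/ping" above belongs to the role's "Re-test connectivity" task at roles/network/tasks/main.yml:192, which uses the ping action and is skipped by the same inherited conditional. A minimal sketch of such a task:

    # ping returns 'pong' when the controller can reach the host and run its Python
    - name: Re-test connectivity
      ansible.builtin.ping: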
12613 1727096144.04233: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096144.04363: in run() - task 0afff68d-5257-a9dd-d073-000000000090 12613 1727096144.04387: variable 'ansible_search_path' from source: unknown 12613 1727096144.04397: variable 'ansible_search_path' from source: unknown 12613 1727096144.04436: calling self._execute() 12613 1727096144.04527: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.04572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.04575: variable 'omit' from source: magic vars 12613 1727096144.04999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.10773: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.10813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.10900: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.11001: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.11273: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.11295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.11371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.11446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.11780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.11785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.12103: variable 'ansible_distribution' from source: facts 12613 1727096144.12118: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.12143: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.12155: when evaluation is False, skipping this task 12613 1727096144.12163: _execute() done 12613 1727096144.12173: dumping result to json 12613 1727096144.12181: done dumping result, returning 12613 1727096144.12193: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-a9dd-d073-000000000090] 12613 1727096144.12216: sending task result for task 0afff68d-5257-a9dd-d073-000000000090 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 12613 1727096144.12408: no more pending results, returning what we have 12613 1727096144.12412: results queue empty 12613 1727096144.12413: checking for any_errors_fatal 12613 1727096144.12420: done checking for any_errors_fatal 12613 1727096144.12421: checking for max_fail_percentage 12613 1727096144.12422: done checking for max_fail_percentage 12613 1727096144.12423: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.12425: done checking to see if all hosts have failed 12613 1727096144.12426: getting the remaining hosts for this loop 12613 1727096144.12427: done getting the remaining hosts for this loop 12613 1727096144.12431: getting the next task for host managed_node1 12613 1727096144.12440: done getting next task for host managed_node1 12613 1727096144.12443: ^ task is: TASK: meta (role_complete) 12613 1727096144.12445: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.12466: getting variables 12613 1727096144.12470: in VariableManager get_vars() 12613 1727096144.12527: Calling all_inventory to load vars for managed_node1 12613 1727096144.12530: Calling groups_inventory to load vars for managed_node1 12613 1727096144.12532: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.12542: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.12545: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.12548: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.13373: done sending task result for task 0afff68d-5257-a9dd-d073-000000000090 12613 1727096144.13377: WORKER PROCESS EXITING 12613 1727096144.13425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.13634: done with get_vars() 12613 1727096144.13645: done getting variables 12613 1727096144.13725: done queuing things up, now waiting for results queue to drain 12613 1727096144.13727: results queue empty 12613 1727096144.13728: checking for any_errors_fatal 12613 1727096144.13730: done checking for any_errors_fatal 12613 1727096144.13731: checking for max_fail_percentage 12613 1727096144.13732: done checking for max_fail_percentage 12613 1727096144.13733: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.13734: done checking to see if all hosts have failed 12613 1727096144.13734: getting the remaining hosts for this loop 12613 1727096144.13735: done getting the remaining hosts for this loop 12613 1727096144.13738: getting the next task for host managed_node1 12613 1727096144.13742: done getting next task for host managed_node1 12613 1727096144.13744: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 12613 1727096144.13746: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.13748: getting variables 12613 1727096144.13749: in VariableManager get_vars() 12613 1727096144.13772: Calling all_inventory to load vars for managed_node1 12613 1727096144.13775: Calling groups_inventory to load vars for managed_node1 12613 1727096144.13777: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.13782: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.13784: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.13786: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.13924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.14125: done with get_vars() 12613 1727096144.14133: done getting variables 12613 1727096144.14177: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12613 1727096144.14298: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Monday 23 September 2024 08:55:44 -0400 (0:00:00.106) 0:00:07.780 ****** 12613 1727096144.14324: entering _queue_task() for managed_node1/command 12613 1727096144.14629: worker is 1 (out of 1 available) 12613 1727096144.14640: exiting _queue_task() for managed_node1/command 12613 1727096144.14654: done queuing things up, now waiting for results queue to drain 12613 1727096144.14655: waiting for pending results... 
12613 1727096144.14931: running TaskExecutor() for managed_node1/TASK: From the active connection, get the port1 profile "bond0.0" 12613 1727096144.15038: in run() - task 0afff68d-5257-a9dd-d073-0000000000c0 12613 1727096144.15060: variable 'ansible_search_path' from source: unknown 12613 1727096144.15109: calling self._execute() 12613 1727096144.15205: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.15216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.15232: variable 'omit' from source: magic vars 12613 1727096144.15697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.18422: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.18499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.18561: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.18589: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.18619: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.18774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.18778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.18781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.18810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.18828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.18977: variable 'ansible_distribution' from source: facts 12613 1727096144.18992: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.19013: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.19021: when evaluation is False, skipping this task 12613 1727096144.19028: _execute() done 12613 1727096144.19034: dumping result to json 12613 1727096144.19040: done dumping result, returning 12613 1727096144.19051: done running TaskExecutor() for managed_node1/TASK: From the active connection, get the port1 profile "bond0.0" [0afff68d-5257-a9dd-d073-0000000000c0] 12613 1727096144.19062: sending task result for task 0afff68d-5257-a9dd-d073-0000000000c0 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.19258: no more pending 
results, returning what we have 12613 1727096144.19262: results queue empty 12613 1727096144.19263: checking for any_errors_fatal 12613 1727096144.19264: done checking for any_errors_fatal 12613 1727096144.19265: checking for max_fail_percentage 12613 1727096144.19268: done checking for max_fail_percentage 12613 1727096144.19269: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.19270: done checking to see if all hosts have failed 12613 1727096144.19271: getting the remaining hosts for this loop 12613 1727096144.19273: done getting the remaining hosts for this loop 12613 1727096144.19276: getting the next task for host managed_node1 12613 1727096144.19283: done getting next task for host managed_node1 12613 1727096144.19286: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 12613 1727096144.19288: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.19291: getting variables 12613 1727096144.19293: in VariableManager get_vars() 12613 1727096144.19349: Calling all_inventory to load vars for managed_node1 12613 1727096144.19355: Calling groups_inventory to load vars for managed_node1 12613 1727096144.19358: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.19472: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.19477: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.19483: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000c0 12613 1727096144.19486: WORKER PROCESS EXITING 12613 1727096144.19490: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.19927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.20128: done with get_vars() 12613 1727096144.20139: done getting variables 12613 1727096144.20202: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12613 1727096144.20321: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Monday 23 September 2024 08:55:44 -0400 (0:00:00.060) 0:00:07.840 ****** 12613 1727096144.20347: entering _queue_task() for managed_node1/command 12613 1727096144.20623: worker is 1 (out of 1 available) 12613 1727096144.20635: exiting _queue_task() for managed_node1/command 12613 1727096144.20649: done queuing things up, now waiting for results queue to drain 12613 1727096144.20650: waiting for pending results... 
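The banners for the two profile-lookup tasks are rendered from play vars: the log notes "variable 'port1_profile' from source: play vars" and then prints the task name with "bond0.0" substituted (and "bond0.1" for port2). The command action is loaded for both tasks, but because they are skipped the actual command line never appears in this log. A sketch of the shape this implies, with a hypothetical nmcli command standing in for the uncaptured one:

    - hosts: managed_node1
      vars:
        port1_profile: bond0.0    # values inferred from the rendered task banners
        port2_profile: bond0.1
      tasks:
        - name: From the active connection, get the port1 profile "{{ port1_profile }}"
          # Hypothetical command; the real command line is never printed because the task is skipped.
          ansible.builtin.command: nmcli -g NAME connection show --active
          register: active_connection_names   # hypothetical register name, reused in the assert sketch further down
          changed_when: false

The port2 task at tests_bond_removal.yml:111 follows the same pattern with port2_profile.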
12613 1727096144.20902: running TaskExecutor() for managed_node1/TASK: From the active connection, get the port2 profile "bond0.1" 12613 1727096144.20983: in run() - task 0afff68d-5257-a9dd-d073-0000000000c1 12613 1727096144.20997: variable 'ansible_search_path' from source: unknown 12613 1727096144.21031: calling self._execute() 12613 1727096144.21116: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.21122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.21133: variable 'omit' from source: magic vars 12613 1727096144.21562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.24108: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.24212: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.24274: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.24348: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.24560: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.24697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.24731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.24764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.24812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.24830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.24977: variable 'ansible_distribution' from source: facts 12613 1727096144.24995: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.25019: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.25026: when evaluation is False, skipping this task 12613 1727096144.25033: _execute() done 12613 1727096144.25039: dumping result to json 12613 1727096144.25046: done dumping result, returning 12613 1727096144.25060: done running TaskExecutor() for managed_node1/TASK: From the active connection, get the port2 profile "bond0.1" [0afff68d-5257-a9dd-d073-0000000000c1] 12613 1727096144.25099: sending task result for task 0afff68d-5257-a9dd-d073-0000000000c1 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.25256: no more pending 
results, returning what we have 12613 1727096144.25260: results queue empty 12613 1727096144.25261: checking for any_errors_fatal 12613 1727096144.25266: done checking for any_errors_fatal 12613 1727096144.25266: checking for max_fail_percentage 12613 1727096144.25270: done checking for max_fail_percentage 12613 1727096144.25271: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.25272: done checking to see if all hosts have failed 12613 1727096144.25273: getting the remaining hosts for this loop 12613 1727096144.25274: done getting the remaining hosts for this loop 12613 1727096144.25278: getting the next task for host managed_node1 12613 1727096144.25285: done getting next task for host managed_node1 12613 1727096144.25287: ^ task is: TASK: Assert that the port1 profile is not activated 12613 1727096144.25289: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.25292: getting variables 12613 1727096144.25293: in VariableManager get_vars() 12613 1727096144.25350: Calling all_inventory to load vars for managed_node1 12613 1727096144.25355: Calling groups_inventory to load vars for managed_node1 12613 1727096144.25358: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.25473: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.25478: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.25484: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000c1 12613 1727096144.25486: WORKER PROCESS EXITING 12613 1727096144.25490: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.25865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.26074: done with get_vars() 12613 1727096144.26085: done getting variables 12613 1727096144.26140: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Monday 23 September 2024 08:55:44 -0400 (0:00:00.058) 0:00:07.898 ****** 12613 1727096144.26170: entering _queue_task() for managed_node1/assert 12613 1727096144.26436: worker is 1 (out of 1 available) 12613 1727096144.26448: exiting _queue_task() for managed_node1/assert 12613 1727096144.26463: done queuing things up, now waiting for results queue to drain 12613 1727096144.26464: waiting for pending results... 
12613 1727096144.26674: running TaskExecutor() for managed_node1/TASK: Assert that the port1 profile is not activated 12613 1727096144.26757: in run() - task 0afff68d-5257-a9dd-d073-0000000000c2 12613 1727096144.26769: variable 'ansible_search_path' from source: unknown 12613 1727096144.26803: calling self._execute() 12613 1727096144.26895: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.26902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.26908: variable 'omit' from source: magic vars 12613 1727096144.27339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.29566: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.29636: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.29670: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.29706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.29732: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.29816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.29845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.29871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.29911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.29925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.30062: variable 'ansible_distribution' from source: facts 12613 1727096144.30069: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.30088: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.30091: when evaluation is False, skipping this task 12613 1727096144.30094: _execute() done 12613 1727096144.30096: dumping result to json 12613 1727096144.30099: done dumping result, returning 12613 1727096144.30106: done running TaskExecutor() for managed_node1/TASK: Assert that the port1 profile is not activated [0afff68d-5257-a9dd-d073-0000000000c2] 12613 1727096144.30110: sending task result for task 0afff68d-5257-a9dd-d073-0000000000c2 12613 1727096144.30206: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000c2 12613 1727096144.30209: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.30289: no more pending results, returning what we have 12613 1727096144.30292: results queue empty 12613 1727096144.30293: checking for any_errors_fatal 12613 1727096144.30297: done checking for any_errors_fatal 12613 1727096144.30298: checking for max_fail_percentage 12613 1727096144.30299: done checking for max_fail_percentage 12613 1727096144.30300: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.30301: done checking to see if all hosts have failed 12613 1727096144.30302: getting the remaining hosts for this loop 12613 1727096144.30303: done getting the remaining hosts for this loop 12613 1727096144.30306: getting the next task for host managed_node1 12613 1727096144.30312: done getting next task for host managed_node1 12613 1727096144.30314: ^ task is: TASK: Assert that the port2 profile is not activated 12613 1727096144.30316: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.30319: getting variables 12613 1727096144.30321: in VariableManager get_vars() 12613 1727096144.30374: Calling all_inventory to load vars for managed_node1 12613 1727096144.30377: Calling groups_inventory to load vars for managed_node1 12613 1727096144.30379: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.30387: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.30390: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.30392: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.30630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.30832: done with get_vars() 12613 1727096144.30842: done getting variables 12613 1727096144.30912: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Monday 23 September 2024 08:55:44 -0400 (0:00:00.047) 0:00:07.946 ****** 12613 1727096144.30939: entering _queue_task() for managed_node1/assert 12613 1727096144.31225: worker is 1 (out of 1 available) 12613 1727096144.31239: exiting _queue_task() for managed_node1/assert 12613 1727096144.31254: done queuing things up, now waiting for results queue to drain 12613 1727096144.31256: waiting for pending results... 
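The two assertion tasks queued here ("Assert that the port1 profile is not activated" at tests_bond_removal.yml:118 and the port2 counterpart at :125) are likewise skipped before their 'that' expressions are ever evaluated. Continuing the hypothetical register from the sketch above, such an assertion could look like:

    - name: Assert that the port1 profile is not activated
      ansible.builtin.assert:
        that:
          - port1_profile not in active_connection_names.stdout_lines
        fail_msg: "{{ port1_profile }} is still listed as an active connection"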
12613 1727096144.31561: running TaskExecutor() for managed_node1/TASK: Assert that the port2 profile is not activated 12613 1727096144.31582: in run() - task 0afff68d-5257-a9dd-d073-0000000000c3 12613 1727096144.31595: variable 'ansible_search_path' from source: unknown 12613 1727096144.31631: calling self._execute() 12613 1727096144.31717: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.31721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.31765: variable 'omit' from source: magic vars 12613 1727096144.32146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.34397: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.34492: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.34511: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.34571: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.34575: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.34654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.34688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.34772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.34776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.34779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.34908: variable 'ansible_distribution' from source: facts 12613 1727096144.34911: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.34929: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.34932: when evaluation is False, skipping this task 12613 1727096144.34935: _execute() done 12613 1727096144.34938: dumping result to json 12613 1727096144.34940: done dumping result, returning 12613 1727096144.34947: done running TaskExecutor() for managed_node1/TASK: Assert that the port2 profile is not activated [0afff68d-5257-a9dd-d073-0000000000c3] 12613 1727096144.34950: sending task result for task 0afff68d-5257-a9dd-d073-0000000000c3 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.35148: no more pending results, returning what we 
have 12613 1727096144.35154: results queue empty 12613 1727096144.35155: checking for any_errors_fatal 12613 1727096144.35162: done checking for any_errors_fatal 12613 1727096144.35162: checking for max_fail_percentage 12613 1727096144.35164: done checking for max_fail_percentage 12613 1727096144.35165: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.35165: done checking to see if all hosts have failed 12613 1727096144.35166: getting the remaining hosts for this loop 12613 1727096144.35170: done getting the remaining hosts for this loop 12613 1727096144.35173: getting the next task for host managed_node1 12613 1727096144.35179: done getting next task for host managed_node1 12613 1727096144.35182: ^ task is: TASK: Get the port1 device state 12613 1727096144.35184: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.35186: getting variables 12613 1727096144.35188: in VariableManager get_vars() 12613 1727096144.35237: Calling all_inventory to load vars for managed_node1 12613 1727096144.35240: Calling groups_inventory to load vars for managed_node1 12613 1727096144.35242: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.35251: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.35256: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.35258: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.35440: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000c3 12613 1727096144.35444: WORKER PROCESS EXITING 12613 1727096144.35456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.35583: done with get_vars() 12613 1727096144.35592: done getting variables 12613 1727096144.35635: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Monday 23 September 2024 08:55:44 -0400 (0:00:00.047) 0:00:07.993 ****** 12613 1727096144.35658: entering _queue_task() for managed_node1/command 12613 1727096144.35884: worker is 1 (out of 1 available) 12613 1727096144.35897: exiting _queue_task() for managed_node1/command 12613 1727096144.35910: done queuing things up, now waiting for results queue to drain 12613 1727096144.35911: waiting for pending results... 
12613 1727096144.36080: running TaskExecutor() for managed_node1/TASK: Get the port1 device state 12613 1727096144.36135: in run() - task 0afff68d-5257-a9dd-d073-0000000000c4 12613 1727096144.36148: variable 'ansible_search_path' from source: unknown 12613 1727096144.36181: calling self._execute() 12613 1727096144.36247: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.36254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.36265: variable 'omit' from source: magic vars 12613 1727096144.36578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.38601: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.38645: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.38677: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.38703: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.38722: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.38786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.38806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.38823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.38849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.38860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.38958: variable 'ansible_distribution' from source: facts 12613 1727096144.38962: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.38980: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.38983: when evaluation is False, skipping this task 12613 1727096144.38987: _execute() done 12613 1727096144.38989: dumping result to json 12613 1727096144.38991: done dumping result, returning 12613 1727096144.38993: done running TaskExecutor() for managed_node1/TASK: Get the port1 device state [0afff68d-5257-a9dd-d073-0000000000c4] 12613 1727096144.39003: sending task result for task 0afff68d-5257-a9dd-d073-0000000000c4 12613 1727096144.39098: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000c4 12613 1727096144.39100: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12613 1727096144.39254: no more pending results, returning what we have 12613 1727096144.39258: results queue empty 12613 1727096144.39259: checking for any_errors_fatal 12613 1727096144.39265: done checking for any_errors_fatal 12613 1727096144.39266: checking for max_fail_percentage 12613 1727096144.39270: done checking for max_fail_percentage 12613 1727096144.39271: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.39272: done checking to see if all hosts have failed 12613 1727096144.39272: getting the remaining hosts for this loop 12613 1727096144.39273: done getting the remaining hosts for this loop 12613 1727096144.39276: getting the next task for host managed_node1 12613 1727096144.39281: done getting next task for host managed_node1 12613 1727096144.39284: ^ task is: TASK: Get the port2 device state 12613 1727096144.39285: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.39288: getting variables 12613 1727096144.39289: in VariableManager get_vars() 12613 1727096144.39336: Calling all_inventory to load vars for managed_node1 12613 1727096144.39344: Calling groups_inventory to load vars for managed_node1 12613 1727096144.39347: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.39360: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.39382: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.39386: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.39646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.39862: done with get_vars() 12613 1727096144.39875: done getting variables 12613 1727096144.39939: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port2 device state] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139 Monday 23 September 2024 08:55:44 -0400 (0:00:00.043) 0:00:08.036 ****** 12613 1727096144.39970: entering _queue_task() for managed_node1/command 12613 1727096144.40307: worker is 1 (out of 1 available) 12613 1727096144.40320: exiting _queue_task() for managed_node1/command 12613 1727096144.40371: done queuing things up, now waiting for results queue to drain 12613 1727096144.40372: waiting for pending results... 
12613 1727096144.40820: running TaskExecutor() for managed_node1/TASK: Get the port2 device state 12613 1727096144.40824: in run() - task 0afff68d-5257-a9dd-d073-0000000000c5 12613 1727096144.40828: variable 'ansible_search_path' from source: unknown 12613 1727096144.40831: calling self._execute() 12613 1727096144.40834: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.40836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.40839: variable 'omit' from source: magic vars 12613 1727096144.41264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.43340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.43477: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.43508: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.43543: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.43571: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.43661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.43710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.43757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.43816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.43845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.44022: variable 'ansible_distribution' from source: facts 12613 1727096144.44029: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.44033: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.44035: when evaluation is False, skipping this task 12613 1727096144.44038: _execute() done 12613 1727096144.44040: dumping result to json 12613 1727096144.44042: done dumping result, returning 12613 1727096144.44044: done running TaskExecutor() for managed_node1/TASK: Get the port2 device state [0afff68d-5257-a9dd-d073-0000000000c5] 12613 1727096144.44046: sending task result for task 0afff68d-5257-a9dd-d073-0000000000c5 12613 1727096144.44111: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000c5 12613 1727096144.44113: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12613 1727096144.44182: no more pending results, returning what we have 12613 1727096144.44185: results queue empty 12613 1727096144.44186: checking for any_errors_fatal 12613 1727096144.44192: done checking for any_errors_fatal 12613 1727096144.44192: checking for max_fail_percentage 12613 1727096144.44194: done checking for max_fail_percentage 12613 1727096144.44195: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.44197: done checking to see if all hosts have failed 12613 1727096144.44197: getting the remaining hosts for this loop 12613 1727096144.44198: done getting the remaining hosts for this loop 12613 1727096144.44202: getting the next task for host managed_node1 12613 1727096144.44208: done getting next task for host managed_node1 12613 1727096144.44211: ^ task is: TASK: Assert that the port1 device is in DOWN state 12613 1727096144.44213: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.44216: getting variables 12613 1727096144.44217: in VariableManager get_vars() 12613 1727096144.44270: Calling all_inventory to load vars for managed_node1 12613 1727096144.44273: Calling groups_inventory to load vars for managed_node1 12613 1727096144.44275: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.44284: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.44287: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.44289: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.44513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.44722: done with get_vars() 12613 1727096144.44732: done getting variables 12613 1727096144.44807: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 device is in DOWN state] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146 Monday 23 September 2024 08:55:44 -0400 (0:00:00.048) 0:00:08.085 ****** 12613 1727096144.44834: entering _queue_task() for managed_node1/assert 12613 1727096144.45103: worker is 1 (out of 1 available) 12613 1727096144.45115: exiting _queue_task() for managed_node1/assert 12613 1727096144.45131: done queuing things up, now waiting for results queue to drain 12613 1727096144.45133: waiting for pending results... 
12613 1727096144.45944: running TaskExecutor() for managed_node1/TASK: Assert that the port1 device is in DOWN state 12613 1727096144.45949: in run() - task 0afff68d-5257-a9dd-d073-0000000000c6 12613 1727096144.45955: variable 'ansible_search_path' from source: unknown 12613 1727096144.45958: calling self._execute() 12613 1727096144.46057: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.46070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.46086: variable 'omit' from source: magic vars 12613 1727096144.46662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.49171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.49456: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.49460: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.49463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.49528: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.49626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.49671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.49708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.49757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.49795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.49996: variable 'ansible_distribution' from source: facts 12613 1727096144.50002: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.50007: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.50010: when evaluation is False, skipping this task 12613 1727096144.50012: _execute() done 12613 1727096144.50015: dumping result to json 12613 1727096144.50017: done dumping result, returning 12613 1727096144.50026: done running TaskExecutor() for managed_node1/TASK: Assert that the port1 device is in DOWN state [0afff68d-5257-a9dd-d073-0000000000c6] 12613 1727096144.50036: sending task result for task 0afff68d-5257-a9dd-d073-0000000000c6 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.50276: no more pending results, returning what we 
have 12613 1727096144.50280: results queue empty 12613 1727096144.50281: checking for any_errors_fatal 12613 1727096144.50285: done checking for any_errors_fatal 12613 1727096144.50286: checking for max_fail_percentage 12613 1727096144.50288: done checking for max_fail_percentage 12613 1727096144.50289: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.50289: done checking to see if all hosts have failed 12613 1727096144.50290: getting the remaining hosts for this loop 12613 1727096144.50292: done getting the remaining hosts for this loop 12613 1727096144.50296: getting the next task for host managed_node1 12613 1727096144.50302: done getting next task for host managed_node1 12613 1727096144.50305: ^ task is: TASK: Assert that the port2 device is in DOWN state 12613 1727096144.50307: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.50310: getting variables 12613 1727096144.50312: in VariableManager get_vars() 12613 1727096144.50620: Calling all_inventory to load vars for managed_node1 12613 1727096144.50623: Calling groups_inventory to load vars for managed_node1 12613 1727096144.50626: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.50637: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.50640: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.50643: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.50965: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000c6 12613 1727096144.50970: WORKER PROCESS EXITING 12613 1727096144.50993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.51196: done with get_vars() 12613 1727096144.51207: done getting variables 12613 1727096144.51277: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port2 device is in DOWN state] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153 Monday 23 September 2024 08:55:44 -0400 (0:00:00.064) 0:00:08.150 ****** 12613 1727096144.51305: entering _queue_task() for managed_node1/assert 12613 1727096144.51703: worker is 1 (out of 1 available) 12613 1727096144.51713: exiting _queue_task() for managed_node1/assert 12613 1727096144.51724: done queuing things up, now waiting for results queue to drain 12613 1727096144.51725: waiting for pending results... 
12613 1727096144.51924: running TaskExecutor() for managed_node1/TASK: Assert that the port2 device is in DOWN state 12613 1727096144.52037: in run() - task 0afff68d-5257-a9dd-d073-0000000000c7 12613 1727096144.52065: variable 'ansible_search_path' from source: unknown 12613 1727096144.52116: calling self._execute() 12613 1727096144.52243: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.52257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.52274: variable 'omit' from source: magic vars 12613 1727096144.52713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.55502: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.55594: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.55628: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.55701: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.55704: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.55785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.55828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.55973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.55976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.55978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.56047: variable 'ansible_distribution' from source: facts 12613 1727096144.56061: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.56084: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.56098: when evaluation is False, skipping this task 12613 1727096144.56104: _execute() done 12613 1727096144.56111: dumping result to json 12613 1727096144.56118: done dumping result, returning 12613 1727096144.56128: done running TaskExecutor() for managed_node1/TASK: Assert that the port2 device is in DOWN state [0afff68d-5257-a9dd-d073-0000000000c7] 12613 1727096144.56136: sending task result for task 0afff68d-5257-a9dd-d073-0000000000c7 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.56366: no more pending results, returning what we 
have 12613 1727096144.56371: results queue empty 12613 1727096144.56373: checking for any_errors_fatal 12613 1727096144.56380: done checking for any_errors_fatal 12613 1727096144.56381: checking for max_fail_percentage 12613 1727096144.56383: done checking for max_fail_percentage 12613 1727096144.56384: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.56384: done checking to see if all hosts have failed 12613 1727096144.56385: getting the remaining hosts for this loop 12613 1727096144.56387: done getting the remaining hosts for this loop 12613 1727096144.56390: getting the next task for host managed_node1 12613 1727096144.56399: done getting next task for host managed_node1 12613 1727096144.56405: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096144.56408: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.56428: getting variables 12613 1727096144.56430: in VariableManager get_vars() 12613 1727096144.56489: Calling all_inventory to load vars for managed_node1 12613 1727096144.56492: Calling groups_inventory to load vars for managed_node1 12613 1727096144.56495: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.56505: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.56508: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.56511: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.56947: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000c7 12613 1727096144.56950: WORKER PROCESS EXITING 12613 1727096144.56981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.57211: done with get_vars() 12613 1727096144.57222: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:44 -0400 (0:00:00.060) 0:00:08.210 ****** 12613 1727096144.57322: entering _queue_task() for managed_node1/include_tasks 12613 1727096144.57606: worker is 1 (out of 1 available) 12613 1727096144.57618: exiting _queue_task() for managed_node1/include_tasks 12613 1727096144.57630: done queuing things up, now waiting for results queue to drain 12613 1727096144.57631: waiting for pending results... 
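From this point on, the skipped tasks belong to the fedora.linux_system_roles.network role itself (task paths under roles/network/tasks/main.yml). They all report the identical false_condition, which is what happens when a when: is attached where the role is pulled into the test play: a condition on an include is inherited by every task that the include brings in. The first role task, "Ensure ansible_facts used by role", is an include_tasks action; a hedged sketch of what such an entry point could look like (the included file name and the gating expression are assumptions, only the task name and the include_tasks action are visible in the log):

    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml   # hypothetical file name
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0   # hypothetical gate, gather only if facts are missing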
12613 1727096144.57923: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096144.58056: in run() - task 0afff68d-5257-a9dd-d073-0000000000cf 12613 1727096144.58077: variable 'ansible_search_path' from source: unknown 12613 1727096144.58084: variable 'ansible_search_path' from source: unknown 12613 1727096144.58128: calling self._execute() 12613 1727096144.58227: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.58239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.58257: variable 'omit' from source: magic vars 12613 1727096144.58803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.61301: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.61574: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.61577: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.61580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.61582: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.61584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.61618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.61647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.61701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.61723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.61870: variable 'ansible_distribution' from source: facts 12613 1727096144.61882: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.61903: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.61915: when evaluation is False, skipping this task 12613 1727096144.61925: _execute() done 12613 1727096144.61931: dumping result to json 12613 1727096144.61938: done dumping result, returning 12613 1727096144.61949: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-a9dd-d073-0000000000cf] 12613 1727096144.61961: sending task result for task 0afff68d-5257-a9dd-d073-0000000000cf skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int 
< 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.62225: no more pending results, returning what we have 12613 1727096144.62230: results queue empty 12613 1727096144.62231: checking for any_errors_fatal 12613 1727096144.62240: done checking for any_errors_fatal 12613 1727096144.62241: checking for max_fail_percentage 12613 1727096144.62243: done checking for max_fail_percentage 12613 1727096144.62244: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.62244: done checking to see if all hosts have failed 12613 1727096144.62245: getting the remaining hosts for this loop 12613 1727096144.62247: done getting the remaining hosts for this loop 12613 1727096144.62251: getting the next task for host managed_node1 12613 1727096144.62260: done getting next task for host managed_node1 12613 1727096144.62265: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096144.62269: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.62287: getting variables 12613 1727096144.62289: in VariableManager get_vars() 12613 1727096144.62344: Calling all_inventory to load vars for managed_node1 12613 1727096144.62346: Calling groups_inventory to load vars for managed_node1 12613 1727096144.62349: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.62362: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.62365: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.62474: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.62823: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000cf 12613 1727096144.62826: WORKER PROCESS EXITING 12613 1727096144.62848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.63075: done with get_vars() 12613 1727096144.63086: done getting variables 12613 1727096144.63149: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:44 -0400 (0:00:00.058) 0:00:08.269 ****** 12613 1727096144.63186: entering _queue_task() for managed_node1/debug 12613 1727096144.63478: worker is 1 (out of 1 available) 12613 1727096144.63493: exiting _queue_task() for managed_node1/debug 12613 1727096144.63506: done queuing things up, now waiting for results queue to drain 12613 1727096144.63507: waiting for pending results... 
12613 1727096144.63739: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096144.63843: in run() - task 0afff68d-5257-a9dd-d073-0000000000d0 12613 1727096144.63857: variable 'ansible_search_path' from source: unknown 12613 1727096144.63861: variable 'ansible_search_path' from source: unknown 12613 1727096144.63896: calling self._execute() 12613 1727096144.63984: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.63988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.64002: variable 'omit' from source: magic vars 12613 1727096144.64389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.67298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.67376: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.67421: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.67462: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.67500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.67587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.67628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.67660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.67711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.67773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.67877: variable 'ansible_distribution' from source: facts 12613 1727096144.67888: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.67909: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.67920: when evaluation is False, skipping this task 12613 1727096144.67928: _execute() done 12613 1727096144.67935: dumping result to json 12613 1727096144.67941: done dumping result, returning 12613 1727096144.67955: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-a9dd-d073-0000000000d0] 12613 1727096144.67965: sending task result for task 0afff68d-5257-a9dd-d073-0000000000d0 12613 1727096144.68214: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000d0 12613 1727096144.68217: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096144.68281: no more pending results, returning what we have 12613 1727096144.68285: results queue empty 12613 1727096144.68287: checking for any_errors_fatal 12613 1727096144.68293: done checking for any_errors_fatal 12613 1727096144.68293: checking for max_fail_percentage 12613 1727096144.68296: done checking for max_fail_percentage 12613 1727096144.68297: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.68298: done checking to see if all hosts have failed 12613 1727096144.68298: getting the remaining hosts for this loop 12613 1727096144.68300: done getting the remaining hosts for this loop 12613 1727096144.68304: getting the next task for host managed_node1 12613 1727096144.68311: done getting next task for host managed_node1 12613 1727096144.68315: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096144.68318: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096144.68336: getting variables 12613 1727096144.68338: in VariableManager get_vars() 12613 1727096144.68509: Calling all_inventory to load vars for managed_node1 12613 1727096144.68512: Calling groups_inventory to load vars for managed_node1 12613 1727096144.68514: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.68522: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.68524: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.68526: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.69004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.69462: done with get_vars() 12613 1727096144.69477: done getting variables 12613 1727096144.69703: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:44 -0400 (0:00:00.065) 0:00:08.334 ****** 12613 1727096144.69738: entering _queue_task() for managed_node1/fail 12613 1727096144.70385: worker is 1 (out of 1 available) 12613 1727096144.70398: exiting _queue_task() for managed_node1/fail 12613 1727096144.70421: done queuing things up, now waiting for results queue to drain 12613 1727096144.70423: waiting for pending results... 
12613 1727096144.70715: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096144.70858: in run() - task 0afff68d-5257-a9dd-d073-0000000000d1 12613 1727096144.70889: variable 'ansible_search_path' from source: unknown 12613 1727096144.70898: variable 'ansible_search_path' from source: unknown 12613 1727096144.70941: calling self._execute() 12613 1727096144.71040: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.71085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.71093: variable 'omit' from source: magic vars 12613 1727096144.71560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.75910: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.76061: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.76065: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.76093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.76127: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.76220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.76257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.76291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.76339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.76361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.76574: variable 'ansible_distribution' from source: facts 12613 1727096144.76600: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.76612: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.76620: when evaluation is False, skipping this task 12613 1727096144.76633: _execute() done 12613 1727096144.76673: dumping result to json 12613 1727096144.76677: done dumping result, returning 12613 1727096144.76680: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-a9dd-d073-0000000000d1] 12613 1727096144.76682: sending task result for task 
0afff68d-5257-a9dd-d073-0000000000d1 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.76979: no more pending results, returning what we have 12613 1727096144.76983: results queue empty 12613 1727096144.76984: checking for any_errors_fatal 12613 1727096144.76993: done checking for any_errors_fatal 12613 1727096144.76994: checking for max_fail_percentage 12613 1727096144.76996: done checking for max_fail_percentage 12613 1727096144.76996: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.76997: done checking to see if all hosts have failed 12613 1727096144.76998: getting the remaining hosts for this loop 12613 1727096144.76999: done getting the remaining hosts for this loop 12613 1727096144.77003: getting the next task for host managed_node1 12613 1727096144.77010: done getting next task for host managed_node1 12613 1727096144.77015: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096144.77018: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096144.77038: getting variables 12613 1727096144.77040: in VariableManager get_vars() 12613 1727096144.77210: Calling all_inventory to load vars for managed_node1 12613 1727096144.77213: Calling groups_inventory to load vars for managed_node1 12613 1727096144.77215: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.77226: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.77229: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.77232: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.77565: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000d1 12613 1727096144.77571: WORKER PROCESS EXITING 12613 1727096144.77594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.77824: done with get_vars() 12613 1727096144.77837: done getting variables 12613 1727096144.77903: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:44 -0400 (0:00:00.081) 0:00:08.416 ****** 12613 1727096144.77939: entering _queue_task() for managed_node1/fail 12613 1727096144.78215: worker is 1 (out of 1 available) 12613 1727096144.78228: exiting _queue_task() for managed_node1/fail 12613 1727096144.78239: done queuing things up, now waiting for results queue to drain 12613 1727096144.78240: waiting for pending results... 
12613 1727096144.78589: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096144.78755: in run() - task 0afff68d-5257-a9dd-d073-0000000000d2 12613 1727096144.78780: variable 'ansible_search_path' from source: unknown 12613 1727096144.78793: variable 'ansible_search_path' from source: unknown 12613 1727096144.78889: calling self._execute() 12613 1727096144.78941: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.78958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.78980: variable 'omit' from source: magic vars 12613 1727096144.79455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.81959: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.82042: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.82092: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.82132: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.82170: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.82308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.82312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.82344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.82418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.82445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.82645: variable 'ansible_distribution' from source: facts 12613 1727096144.82650: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.82698: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.82701: when evaluation is False, skipping this task 12613 1727096144.82703: _execute() done 12613 1727096144.82706: dumping result to json 12613 1727096144.82708: done dumping result, returning 12613 1727096144.82710: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-a9dd-d073-0000000000d2] 12613 1727096144.82749: sending task result for task 0afff68d-5257-a9dd-d073-0000000000d2 12613 1727096144.82881: 
done sending task result for task 0afff68d-5257-a9dd-d073-0000000000d2 12613 1727096144.82884: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.83032: no more pending results, returning what we have 12613 1727096144.83037: results queue empty 12613 1727096144.83038: checking for any_errors_fatal 12613 1727096144.83045: done checking for any_errors_fatal 12613 1727096144.83046: checking for max_fail_percentage 12613 1727096144.83048: done checking for max_fail_percentage 12613 1727096144.83049: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.83050: done checking to see if all hosts have failed 12613 1727096144.83050: getting the remaining hosts for this loop 12613 1727096144.83054: done getting the remaining hosts for this loop 12613 1727096144.83059: getting the next task for host managed_node1 12613 1727096144.83066: done getting next task for host managed_node1 12613 1727096144.83276: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096144.83279: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096144.83296: getting variables 12613 1727096144.83298: in VariableManager get_vars() 12613 1727096144.83347: Calling all_inventory to load vars for managed_node1 12613 1727096144.83350: Calling groups_inventory to load vars for managed_node1 12613 1727096144.83355: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.83364: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.83474: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.83480: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.83789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.84301: done with get_vars() 12613 1727096144.84318: done getting variables 12613 1727096144.84402: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:44 -0400 (0:00:00.064) 0:00:08.481 ****** 12613 1727096144.84440: entering _queue_task() for managed_node1/fail 12613 1727096144.84790: worker is 1 (out of 1 available) 12613 1727096144.84803: exiting _queue_task() for managed_node1/fail 12613 1727096144.84815: done queuing things up, now waiting for results queue to drain 12613 1727096144.84816: waiting for pending results... 
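A similar fail-style guard covers teaming on EL10 and later, as the task banner above indicates. One way such a check could be written, with the connection-list filtering entirely assumed:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      fail:
        msg: Teaming is not supported on EL10 or later; use bonding instead.   # wording is an assumption
      when:
        - ansible_distribution_major_version | int >= 10   # hypothetical version guard
        - network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'team') | list | length > 0   # hypothetical filter for team profiles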
12613 1727096144.85011: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096144.85093: in run() - task 0afff68d-5257-a9dd-d073-0000000000d3 12613 1727096144.85103: variable 'ansible_search_path' from source: unknown 12613 1727096144.85107: variable 'ansible_search_path' from source: unknown 12613 1727096144.85136: calling self._execute() 12613 1727096144.85207: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.85210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.85221: variable 'omit' from source: magic vars 12613 1727096144.85546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.88391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.88447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.88479: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.88508: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.88526: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.88591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.88615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.88633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.88661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.88673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.88773: variable 'ansible_distribution' from source: facts 12613 1727096144.88783: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.88795: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.88798: when evaluation is False, skipping this task 12613 1727096144.88801: _execute() done 12613 1727096144.88804: dumping result to json 12613 1727096144.88806: done dumping result, returning 12613 1727096144.88817: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-a9dd-d073-0000000000d3] 12613 1727096144.88820: sending task result for task 0afff68d-5257-a9dd-d073-0000000000d3 12613 1727096144.88910: done 
sending task result for task 0afff68d-5257-a9dd-d073-0000000000d3 12613 1727096144.88913: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.88986: no more pending results, returning what we have 12613 1727096144.88989: results queue empty 12613 1727096144.88990: checking for any_errors_fatal 12613 1727096144.88997: done checking for any_errors_fatal 12613 1727096144.88998: checking for max_fail_percentage 12613 1727096144.89000: done checking for max_fail_percentage 12613 1727096144.89001: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.89001: done checking to see if all hosts have failed 12613 1727096144.89002: getting the remaining hosts for this loop 12613 1727096144.89003: done getting the remaining hosts for this loop 12613 1727096144.89007: getting the next task for host managed_node1 12613 1727096144.89013: done getting next task for host managed_node1 12613 1727096144.89017: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096144.89020: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096144.89040: getting variables 12613 1727096144.89042: in VariableManager get_vars() 12613 1727096144.89091: Calling all_inventory to load vars for managed_node1 12613 1727096144.89093: Calling groups_inventory to load vars for managed_node1 12613 1727096144.89095: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.89103: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.89105: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.89108: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.89239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.89362: done with get_vars() 12613 1727096144.89372: done getting variables 12613 1727096144.89414: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:44 -0400 (0:00:00.049) 0:00:08.531 ****** 12613 1727096144.89437: entering _queue_task() for managed_node1/dnf 12613 1727096144.89651: worker is 1 (out of 1 available) 12613 1727096144.89665: exiting _queue_task() for managed_node1/dnf 12613 1727096144.89679: done queuing things up, now waiting for results queue to drain 12613 1727096144.89680: waiting for pending results... 
12613 1727096144.89974: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096144.90157: in run() - task 0afff68d-5257-a9dd-d073-0000000000d4 12613 1727096144.90161: variable 'ansible_search_path' from source: unknown 12613 1727096144.90164: variable 'ansible_search_path' from source: unknown 12613 1727096144.90166: calling self._execute() 12613 1727096144.90346: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.90400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.90687: variable 'omit' from source: magic vars 12613 1727096144.91144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.93017: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.93066: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.93098: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.93124: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.93142: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.93209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.93228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.93245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.93275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.93286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.93384: variable 'ansible_distribution' from source: facts 12613 1727096144.93387: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.93404: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.93408: when evaluation is False, skipping this task 12613 1727096144.93411: _execute() done 12613 1727096144.93413: dumping result to json 12613 1727096144.93415: done dumping result, returning 12613 1727096144.93425: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-0000000000d4] 12613 1727096144.93428: sending task result for task 
0afff68d-5257-a9dd-d073-0000000000d4 12613 1727096144.93525: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000d4 12613 1727096144.93527: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.93610: no more pending results, returning what we have 12613 1727096144.93613: results queue empty 12613 1727096144.93614: checking for any_errors_fatal 12613 1727096144.93621: done checking for any_errors_fatal 12613 1727096144.93622: checking for max_fail_percentage 12613 1727096144.93624: done checking for max_fail_percentage 12613 1727096144.93624: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.93625: done checking to see if all hosts have failed 12613 1727096144.93626: getting the remaining hosts for this loop 12613 1727096144.93628: done getting the remaining hosts for this loop 12613 1727096144.93632: getting the next task for host managed_node1 12613 1727096144.93638: done getting next task for host managed_node1 12613 1727096144.93692: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096144.93696: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096144.93712: getting variables 12613 1727096144.93713: in VariableManager get_vars() 12613 1727096144.93761: Calling all_inventory to load vars for managed_node1 12613 1727096144.93764: Calling groups_inventory to load vars for managed_node1 12613 1727096144.93766: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.93787: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.93791: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.93797: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.94037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.94254: done with get_vars() 12613 1727096144.94265: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096144.94343: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:44 -0400 (0:00:00.049) 0:00:08.580 ****** 12613 1727096144.94377: entering _queue_task() for managed_node1/yum 12613 1727096144.94671: worker is 1 (out of 1 available) 12613 1727096144.94686: exiting _queue_task() for managed_node1/yum 12613 1727096144.94697: done queuing things up, now waiting for results queue to drain 12613 1727096144.94699: waiting for pending results... 
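The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" record above means the yum name is only an alias on this target: a task written against ansible.builtin.yum is serviced by the dnf action plugin and module. A minimal illustration of a task that would trigger that redirect (not the role's source):

    - name: Query for package updates via the yum module name
      ansible.builtin.yum:        # resolved to the dnf action/module on dnf-based hosts
        name: NetworkManager
        state: latest
      check_mode: true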
12613 1727096144.94999: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096144.95078: in run() - task 0afff68d-5257-a9dd-d073-0000000000d5 12613 1727096144.95082: variable 'ansible_search_path' from source: unknown 12613 1727096144.95085: variable 'ansible_search_path' from source: unknown 12613 1727096144.95102: calling self._execute() 12613 1727096144.95194: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.95198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.95256: variable 'omit' from source: magic vars 12613 1727096144.95653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096144.97272: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096144.97326: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096144.97353: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096144.97381: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096144.97401: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096144.97465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096144.97486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096144.97505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096144.97533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096144.97543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096144.97643: variable 'ansible_distribution' from source: facts 12613 1727096144.97646: variable 'ansible_distribution_major_version' from source: facts 12613 1727096144.97664: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096144.97668: when evaluation is False, skipping this task 12613 1727096144.97671: _execute() done 12613 1727096144.97673: dumping result to json 12613 1727096144.97676: done dumping result, returning 12613 1727096144.97683: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-0000000000d5] 12613 1727096144.97688: sending task result for task 
0afff68d-5257-a9dd-d073-0000000000d5 12613 1727096144.97783: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000d5 12613 1727096144.97785: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096144.97832: no more pending results, returning what we have 12613 1727096144.97835: results queue empty 12613 1727096144.97836: checking for any_errors_fatal 12613 1727096144.97844: done checking for any_errors_fatal 12613 1727096144.97845: checking for max_fail_percentage 12613 1727096144.97847: done checking for max_fail_percentage 12613 1727096144.97847: checking to see if all hosts have failed and the running result is not ok 12613 1727096144.97848: done checking to see if all hosts have failed 12613 1727096144.97849: getting the remaining hosts for this loop 12613 1727096144.97850: done getting the remaining hosts for this loop 12613 1727096144.97854: getting the next task for host managed_node1 12613 1727096144.97860: done getting next task for host managed_node1 12613 1727096144.97864: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096144.97866: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096144.97886: getting variables 12613 1727096144.97888: in VariableManager get_vars() 12613 1727096144.97938: Calling all_inventory to load vars for managed_node1 12613 1727096144.97941: Calling groups_inventory to load vars for managed_node1 12613 1727096144.97943: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096144.97952: Calling all_plugins_play to load vars for managed_node1 12613 1727096144.97954: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096144.97957: Calling groups_plugins_play to load vars for managed_node1 12613 1727096144.98208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096144.98418: done with get_vars() 12613 1727096144.98429: done getting variables 12613 1727096144.98490: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:44 -0400 (0:00:00.041) 0:00:08.622 ****** 12613 1727096144.98522: entering _queue_task() for managed_node1/fail 12613 1727096144.98824: worker is 1 (out of 1 available) 12613 1727096144.98835: exiting _queue_task() for managed_node1/fail 12613 1727096144.98847: done queuing things up, now waiting for results queue to drain 12613 1727096144.98848: waiting for pending results... 
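The task queued above, roles/network/tasks/main.yml:60, uses the fail action plugin, which aborts the play with a message; judging from the task name, it fires unless the operator has opted in to a NetworkManager restart. A hedged sketch, with network_allow_restart as a hypothetical opt-in variable (the trace only shows the distribution check being evaluated):

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Activating wireless or team interfaces requires restarting
          NetworkManager; set the opt-in variable to allow this.
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9
        - not network_allow_restart | default(false)   # hypothetical opt-in variable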
12613 1727096144.99144: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096144.99233: in run() - task 0afff68d-5257-a9dd-d073-0000000000d6 12613 1727096144.99242: variable 'ansible_search_path' from source: unknown 12613 1727096144.99250: variable 'ansible_search_path' from source: unknown 12613 1727096144.99284: calling self._execute() 12613 1727096144.99357: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096144.99361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096144.99371: variable 'omit' from source: magic vars 12613 1727096144.99689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.01269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.01314: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.01342: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.01372: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.01392: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.01455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.01479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.01497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.01523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.01533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.01634: variable 'ansible_distribution' from source: facts 12613 1727096145.01637: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.01656: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.01660: when evaluation is False, skipping this task 12613 1727096145.01662: _execute() done 12613 1727096145.01664: dumping result to json 12613 1727096145.01666: done dumping result, returning 12613 1727096145.01674: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-0000000000d6] 12613 1727096145.01680: sending task result for task 0afff68d-5257-a9dd-d073-0000000000d6 12613 1727096145.01777: done sending task result for task 
0afff68d-5257-a9dd-d073-0000000000d6 12613 1727096145.01779: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.01832: no more pending results, returning what we have 12613 1727096145.01836: results queue empty 12613 1727096145.01837: checking for any_errors_fatal 12613 1727096145.01843: done checking for any_errors_fatal 12613 1727096145.01843: checking for max_fail_percentage 12613 1727096145.01845: done checking for max_fail_percentage 12613 1727096145.01846: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.01847: done checking to see if all hosts have failed 12613 1727096145.01847: getting the remaining hosts for this loop 12613 1727096145.01848: done getting the remaining hosts for this loop 12613 1727096145.01854: getting the next task for host managed_node1 12613 1727096145.01860: done getting next task for host managed_node1 12613 1727096145.01864: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12613 1727096145.01868: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.01887: getting variables 12613 1727096145.01888: in VariableManager get_vars() 12613 1727096145.01940: Calling all_inventory to load vars for managed_node1 12613 1727096145.01942: Calling groups_inventory to load vars for managed_node1 12613 1727096145.01944: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.01955: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.01958: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.01960: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.02141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.02271: done with get_vars() 12613 1727096145.02280: done getting variables 12613 1727096145.02324: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:45 -0400 (0:00:00.038) 0:00:08.660 ****** 12613 1727096145.02348: entering _queue_task() for managed_node1/package 12613 1727096145.02582: worker is 1 (out of 1 available) 12613 1727096145.02595: exiting _queue_task() for managed_node1/package 12613 1727096145.02607: done queuing things up, now waiting for results queue to drain 12613 1727096145.02608: waiting for pending results... 
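The next task queued, roles/network/tasks/main.yml:73 "Install packages", goes through the generic package action plugin, which picks the platform's package manager from facts; that is why the log loads action/package.py here rather than dnf directly. A sketch under the same assumptions as above (network_packages remains a hypothetical variable):

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # hypothetical list of packages the role needs
        state: present
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9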
12613 1727096145.02782: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 12613 1727096145.02870: in run() - task 0afff68d-5257-a9dd-d073-0000000000d7 12613 1727096145.02893: variable 'ansible_search_path' from source: unknown 12613 1727096145.02897: variable 'ansible_search_path' from source: unknown 12613 1727096145.02927: calling self._execute() 12613 1727096145.03004: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.03007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.03018: variable 'omit' from source: magic vars 12613 1727096145.03343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.04918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.05194: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.05223: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.05249: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.05272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.05335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.05360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.05381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.05407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.05417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.05520: variable 'ansible_distribution' from source: facts 12613 1727096145.05523: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.05539: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.05542: when evaluation is False, skipping this task 12613 1727096145.05545: _execute() done 12613 1727096145.05547: dumping result to json 12613 1727096145.05550: done dumping result, returning 12613 1727096145.05565: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-a9dd-d073-0000000000d7] 12613 1727096145.05570: sending task result for task 0afff68d-5257-a9dd-d073-0000000000d7 12613 1727096145.05655: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000d7 12613 1727096145.05658: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.05716: no more pending results, returning what we have 12613 1727096145.05719: results queue empty 12613 1727096145.05720: checking for any_errors_fatal 12613 1727096145.05726: done checking for any_errors_fatal 12613 1727096145.05726: checking for max_fail_percentage 12613 1727096145.05728: done checking for max_fail_percentage 12613 1727096145.05729: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.05730: done checking to see if all hosts have failed 12613 1727096145.05730: getting the remaining hosts for this loop 12613 1727096145.05732: done getting the remaining hosts for this loop 12613 1727096145.05735: getting the next task for host managed_node1 12613 1727096145.05741: done getting next task for host managed_node1 12613 1727096145.05745: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096145.05747: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.05765: getting variables 12613 1727096145.05766: in VariableManager get_vars() 12613 1727096145.05819: Calling all_inventory to load vars for managed_node1 12613 1727096145.05821: Calling groups_inventory to load vars for managed_node1 12613 1727096145.05823: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.05833: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.05836: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.05839: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.05989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.06112: done with get_vars() 12613 1727096145.06121: done getting variables 12613 1727096145.06163: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:45 -0400 (0:00:00.038) 0:00:08.699 ****** 12613 1727096145.06188: entering _queue_task() for managed_node1/package 12613 1727096145.06413: worker is 1 (out of 1 available) 12613 1727096145.06427: exiting _queue_task() for managed_node1/package 12613 1727096145.06438: done queuing things up, now waiting for results queue to drain 12613 1727096145.06440: waiting for pending results... 
12613 1727096145.06617: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096145.06712: in run() - task 0afff68d-5257-a9dd-d073-0000000000d8 12613 1727096145.06724: variable 'ansible_search_path' from source: unknown 12613 1727096145.06727: variable 'ansible_search_path' from source: unknown 12613 1727096145.06757: calling self._execute() 12613 1727096145.06829: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.06833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.06843: variable 'omit' from source: magic vars 12613 1727096145.07159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.08961: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.09011: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.09039: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.09067: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.09088: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.09147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.09173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.09192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.09217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.09228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.09398: variable 'ansible_distribution' from source: facts 12613 1727096145.09401: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.09404: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.09406: when evaluation is False, skipping this task 12613 1727096145.09408: _execute() done 12613 1727096145.09409: dumping result to json 12613 1727096145.09412: done dumping result, returning 12613 1727096145.09414: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-a9dd-d073-0000000000d8] 12613 1727096145.09416: sending task result for task 0afff68d-5257-a9dd-d073-0000000000d8 12613 1727096145.09545: done sending task result for task 
0afff68d-5257-a9dd-d073-0000000000d8 12613 1727096145.09548: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.09632: no more pending results, returning what we have 12613 1727096145.09635: results queue empty 12613 1727096145.09636: checking for any_errors_fatal 12613 1727096145.09642: done checking for any_errors_fatal 12613 1727096145.09642: checking for max_fail_percentage 12613 1727096145.09645: done checking for max_fail_percentage 12613 1727096145.09645: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.09646: done checking to see if all hosts have failed 12613 1727096145.09647: getting the remaining hosts for this loop 12613 1727096145.09648: done getting the remaining hosts for this loop 12613 1727096145.09654: getting the next task for host managed_node1 12613 1727096145.09659: done getting next task for host managed_node1 12613 1727096145.09663: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096145.09666: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096145.09683: getting variables 12613 1727096145.09685: in VariableManager get_vars() 12613 1727096145.09733: Calling all_inventory to load vars for managed_node1 12613 1727096145.09736: Calling groups_inventory to load vars for managed_node1 12613 1727096145.09738: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.09762: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.09766: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.09773: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.10017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.10236: done with get_vars() 12613 1727096145.10247: done getting variables 12613 1727096145.10318: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:45 -0400 (0:00:00.041) 0:00:08.740 ****** 12613 1727096145.10355: entering _queue_task() for managed_node1/package 12613 1727096145.10778: worker is 1 (out of 1 available) 12613 1727096145.10793: exiting _queue_task() for managed_node1/package 12613 1727096145.10805: done queuing things up, now waiting for results queue to drain 12613 1727096145.10807: waiting for pending results... 
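Every task in this stretch is skipped for the same reason: the conditional on ansible_distribution and ansible_distribution_major_version evaluates to False on this host. To see the fact values the role is testing, a small debug task (a convenience sketch, not part of the role) can print them together with the combined expression reported in the false_condition field:

    - name: Show the facts behind the role's distribution checks
      ansible.builtin.debug:
        msg: >-
          {{ ansible_distribution }} {{ ansible_distribution_major_version }}:
          old CentOS/RHEL => {{ ansible_distribution in ['CentOS', 'RedHat']
          and ansible_distribution_major_version | int < 9 }}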
12613 1727096145.10979: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096145.11064: in run() - task 0afff68d-5257-a9dd-d073-0000000000d9 12613 1727096145.11076: variable 'ansible_search_path' from source: unknown 12613 1727096145.11080: variable 'ansible_search_path' from source: unknown 12613 1727096145.11111: calling self._execute() 12613 1727096145.11176: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.11180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.11192: variable 'omit' from source: magic vars 12613 1727096145.11509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.14925: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.15032: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.15059: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.15120: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.15373: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.15377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.15380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.15383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.15385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.15388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.15547: variable 'ansible_distribution' from source: facts 12613 1727096145.15551: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.15553: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.15555: when evaluation is False, skipping this task 12613 1727096145.15557: _execute() done 12613 1727096145.15559: dumping result to json 12613 1727096145.15562: done dumping result, returning 12613 1727096145.15576: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-a9dd-d073-0000000000d9] 12613 1727096145.15584: sending task result for task 0afff68d-5257-a9dd-d073-0000000000d9 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.15740: no more pending results, returning what we have 12613 1727096145.15744: results queue empty 12613 1727096145.15745: checking for any_errors_fatal 12613 1727096145.15753: done checking for any_errors_fatal 12613 1727096145.15754: checking for max_fail_percentage 12613 1727096145.15756: done checking for max_fail_percentage 12613 1727096145.15757: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.15758: done checking to see if all hosts have failed 12613 1727096145.15759: getting the remaining hosts for this loop 12613 1727096145.15760: done getting the remaining hosts for this loop 12613 1727096145.15765: getting the next task for host managed_node1 12613 1727096145.15773: done getting next task for host managed_node1 12613 1727096145.15778: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096145.15781: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.15800: getting variables 12613 1727096145.15802: in VariableManager get_vars() 12613 1727096145.15861: Calling all_inventory to load vars for managed_node1 12613 1727096145.15864: Calling groups_inventory to load vars for managed_node1 12613 1727096145.15868: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.16085: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000d9 12613 1727096145.16088: WORKER PROCESS EXITING 12613 1727096145.16099: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.16103: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.16107: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.16427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.16712: done with get_vars() 12613 1727096145.16727: done getting variables 12613 1727096145.16787: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:45 -0400 (0:00:00.064) 0:00:08.805 ****** 12613 1727096145.16838: entering _queue_task() for managed_node1/service 12613 1727096145.17180: worker is 1 (out of 1 available) 12613 1727096145.17193: exiting _queue_task() for managed_node1/service 12613 1727096145.17207: done queuing things up, now waiting for results queue to drain 12613 1727096145.17208: waiting for 
pending results... 12613 1727096145.17502: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096145.17636: in run() - task 0afff68d-5257-a9dd-d073-0000000000da 12613 1727096145.17657: variable 'ansible_search_path' from source: unknown 12613 1727096145.17665: variable 'ansible_search_path' from source: unknown 12613 1727096145.17714: calling self._execute() 12613 1727096145.17818: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.17832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.17848: variable 'omit' from source: magic vars 12613 1727096145.18509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.20940: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.21052: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.21107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.21146: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.21185: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.21274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.21325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.21357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.21413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.21437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.21626: variable 'ansible_distribution' from source: facts 12613 1727096145.21630: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.21632: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.21634: when evaluation is False, skipping this task 12613 1727096145.21636: _execute() done 12613 1727096145.21638: dumping result to json 12613 1727096145.21640: done dumping result, returning 12613 1727096145.21643: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-0000000000da] 12613 1727096145.21653: sending task result for task 0afff68d-5257-a9dd-d073-0000000000da skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution 
in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.21902: no more pending results, returning what we have 12613 1727096145.21905: results queue empty 12613 1727096145.21906: checking for any_errors_fatal 12613 1727096145.21912: done checking for any_errors_fatal 12613 1727096145.21913: checking for max_fail_percentage 12613 1727096145.21915: done checking for max_fail_percentage 12613 1727096145.21916: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.21916: done checking to see if all hosts have failed 12613 1727096145.21917: getting the remaining hosts for this loop 12613 1727096145.21918: done getting the remaining hosts for this loop 12613 1727096145.21922: getting the next task for host managed_node1 12613 1727096145.21929: done getting next task for host managed_node1 12613 1727096145.21934: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096145.21937: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.21971: getting variables 12613 1727096145.21974: in VariableManager get_vars() 12613 1727096145.22025: Calling all_inventory to load vars for managed_node1 12613 1727096145.22027: Calling groups_inventory to load vars for managed_node1 12613 1727096145.22030: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.22040: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.22043: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.22045: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.22252: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000da 12613 1727096145.22255: WORKER PROCESS EXITING 12613 1727096145.22289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.22431: done with get_vars() 12613 1727096145.22439: done getting variables 12613 1727096145.22493: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:45 -0400 (0:00:00.056) 0:00:08.862 ****** 12613 1727096145.22517: entering _queue_task() for managed_node1/service 12613 1727096145.22750: worker is 1 (out of 1 available) 12613 1727096145.22770: exiting _queue_task() for managed_node1/service 12613 1727096145.22781: done queuing things up, now waiting for results queue to drain 12613 1727096145.22783: waiting for pending results... 
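The records above evaluate and skip roles/network/tasks/main.yml:109, which loads the service action plugin to restart NetworkManager when wireless or team interfaces require it; the records below queue main.yml:122, which enables and starts the daemon. A hedged sketch of the restart task under the same guard shown in the trace:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9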
12613 1727096145.22954: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096145.23038: in run() - task 0afff68d-5257-a9dd-d073-0000000000db 12613 1727096145.23050: variable 'ansible_search_path' from source: unknown 12613 1727096145.23056: variable 'ansible_search_path' from source: unknown 12613 1727096145.23086: calling self._execute() 12613 1727096145.23150: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.23156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.23164: variable 'omit' from source: magic vars 12613 1727096145.23482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.25875: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.25879: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.25882: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.25885: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.25918: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.26008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.26028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.26045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.26079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.26097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.26195: variable 'ansible_distribution' from source: facts 12613 1727096145.26199: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.26214: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.26217: when evaluation is False, skipping this task 12613 1727096145.26219: _execute() done 12613 1727096145.26222: dumping result to json 12613 1727096145.26224: done dumping result, returning 12613 1727096145.26233: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-a9dd-d073-0000000000db] 12613 1727096145.26237: sending task result for task 0afff68d-5257-a9dd-d073-0000000000db 12613 1727096145.26326: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000db 12613 1727096145.26329: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 1727096145.26380: no more pending results, returning what we have 12613 1727096145.26383: results queue empty 12613 1727096145.26384: checking for any_errors_fatal 12613 1727096145.26390: done checking for any_errors_fatal 12613 1727096145.26391: checking for max_fail_percentage 12613 1727096145.26393: done checking for max_fail_percentage 12613 1727096145.26394: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.26394: done checking to see if all hosts have failed 12613 1727096145.26395: getting the remaining hosts for this loop 12613 1727096145.26396: done getting the remaining hosts for this loop 12613 1727096145.26400: getting the next task for host managed_node1 12613 1727096145.26406: done getting next task for host managed_node1 12613 1727096145.26410: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096145.26412: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.26429: getting variables 12613 1727096145.26431: in VariableManager get_vars() 12613 1727096145.26489: Calling all_inventory to load vars for managed_node1 12613 1727096145.26493: Calling groups_inventory to load vars for managed_node1 12613 1727096145.26495: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.26503: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.26505: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.26507: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.26646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.26785: done with get_vars() 12613 1727096145.26794: done getting variables 12613 1727096145.26837: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:45 -0400 (0:00:00.043) 0:00:08.905 ****** 12613 1727096145.26862: entering _queue_task() for managed_node1/service 12613 1727096145.27073: worker is 1 (out of 1 available) 12613 1727096145.27087: exiting _queue_task() for managed_node1/service 12613 1727096145.27099: done queuing things up, now waiting for results queue to drain 12613 1727096145.27100: waiting for pending results... 
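Unlike the other skipped tasks, the "Enable and start NetworkManager" result above is replaced by a censored placeholder because the task sets no_log; that keyword hides module output and registered result details in logs and callbacks, and, as the trace shows, it applies even when the task is skipped. A sketch of how no_log attaches to a task (parameter values assumed, not taken from the role):

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true      # suppresses the task's result details, producing output like the block above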
12613 1727096145.27269: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096145.27342: in run() - task 0afff68d-5257-a9dd-d073-0000000000dc 12613 1727096145.27354: variable 'ansible_search_path' from source: unknown 12613 1727096145.27358: variable 'ansible_search_path' from source: unknown 12613 1727096145.27390: calling self._execute() 12613 1727096145.27458: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.27461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.27471: variable 'omit' from source: magic vars 12613 1727096145.27783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.29984: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.30028: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.30059: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.30084: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.30103: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.30166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.30189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.30206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.30232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.30243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.30340: variable 'ansible_distribution' from source: facts 12613 1727096145.30343: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.30363: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.30366: when evaluation is False, skipping this task 12613 1727096145.30370: _execute() done 12613 1727096145.30372: dumping result to json 12613 1727096145.30374: done dumping result, returning 12613 1727096145.30382: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-a9dd-d073-0000000000dc] 12613 1727096145.30385: sending task result for task 0afff68d-5257-a9dd-d073-0000000000dc skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)", "skip_reason": "Conditional result was False" } 12613 1727096145.30512: no more pending results, returning what we have 12613 1727096145.30515: results queue empty 12613 1727096145.30516: checking for any_errors_fatal 12613 1727096145.30522: done checking for any_errors_fatal 12613 1727096145.30523: checking for max_fail_percentage 12613 1727096145.30525: done checking for max_fail_percentage 12613 1727096145.30526: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.30526: done checking to see if all hosts have failed 12613 1727096145.30527: getting the remaining hosts for this loop 12613 1727096145.30529: done getting the remaining hosts for this loop 12613 1727096145.30532: getting the next task for host managed_node1 12613 1727096145.30538: done getting next task for host managed_node1 12613 1727096145.30542: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096145.30545: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.30565: getting variables 12613 1727096145.30566: in VariableManager get_vars() 12613 1727096145.30625: Calling all_inventory to load vars for managed_node1 12613 1727096145.30629: Calling groups_inventory to load vars for managed_node1 12613 1727096145.30631: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.30638: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000dc 12613 1727096145.30640: WORKER PROCESS EXITING 12613 1727096145.30648: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.30650: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.30656: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.31012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.31131: done with get_vars() 12613 1727096145.31139: done getting variables 12613 1727096145.31185: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:45 -0400 (0:00:00.043) 0:00:08.949 ****** 12613 1727096145.31206: entering _queue_task() for managed_node1/service 12613 1727096145.31438: worker is 1 (out of 1 available) 12613 1727096145.31454: exiting _queue_task() for managed_node1/service 12613 1727096145.31465: done queuing things up, now waiting for results queue to drain 12613 1727096145.31466: waiting for pending results... 
12613 1727096145.31637: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096145.31732: in run() - task 0afff68d-5257-a9dd-d073-0000000000dd 12613 1727096145.31746: variable 'ansible_search_path' from source: unknown 12613 1727096145.31750: variable 'ansible_search_path' from source: unknown 12613 1727096145.31780: calling self._execute() 12613 1727096145.31846: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.31849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.31860: variable 'omit' from source: magic vars 12613 1727096145.32170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.33724: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.33782: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.33808: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.33835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.33857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.33918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.33937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.33957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.33987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.33998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.34098: variable 'ansible_distribution' from source: facts 12613 1727096145.34102: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.34117: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.34120: when evaluation is False, skipping this task 12613 1727096145.34123: _execute() done 12613 1727096145.34126: dumping result to json 12613 1727096145.34128: done dumping result, returning 12613 1727096145.34135: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-a9dd-d073-0000000000dd] 12613 1727096145.34139: sending task result for task 0afff68d-5257-a9dd-d073-0000000000dd 12613 1727096145.34228: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000dd 12613 1727096145.34231: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 1727096145.34275: no more pending results, returning what we have 12613 1727096145.34279: results queue empty 12613 1727096145.34280: checking for any_errors_fatal 12613 1727096145.34286: done checking for any_errors_fatal 12613 1727096145.34286: checking for max_fail_percentage 12613 1727096145.34288: done checking for max_fail_percentage 12613 1727096145.34289: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.34290: done checking to see if all hosts have failed 12613 1727096145.34290: getting the remaining hosts for this loop 12613 1727096145.34292: done getting the remaining hosts for this loop 12613 1727096145.34295: getting the next task for host managed_node1 12613 1727096145.34301: done getting next task for host managed_node1 12613 1727096145.34305: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096145.34307: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.34325: getting variables 12613 1727096145.34327: in VariableManager get_vars() 12613 1727096145.34381: Calling all_inventory to load vars for managed_node1 12613 1727096145.34384: Calling groups_inventory to load vars for managed_node1 12613 1727096145.34386: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.34395: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.34397: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.34400: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.34542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.34683: done with get_vars() 12613 1727096145.34695: done getting variables 12613 1727096145.34736: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:45 -0400 (0:00:00.035) 0:00:08.984 ****** 12613 1727096145.34762: entering _queue_task() for managed_node1/copy 12613 1727096145.34987: worker is 1 (out of 1 available) 12613 1727096145.35001: exiting _queue_task() for managed_node1/copy 12613 1727096145.35013: done queuing things up, now waiting for results queue to drain 12613 1727096145.35014: waiting for pending results... 
12613 1727096145.35201: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096145.35294: in run() - task 0afff68d-5257-a9dd-d073-0000000000de 12613 1727096145.35305: variable 'ansible_search_path' from source: unknown 12613 1727096145.35308: variable 'ansible_search_path' from source: unknown 12613 1727096145.35337: calling self._execute() 12613 1727096145.35411: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.35415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.35425: variable 'omit' from source: magic vars 12613 1727096145.35750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.37363: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.37412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.37440: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.37469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.37489: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.37551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.37577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.37594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.37620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.37630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.37732: variable 'ansible_distribution' from source: facts 12613 1727096145.37735: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.37754: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.37759: when evaluation is False, skipping this task 12613 1727096145.37762: _execute() done 12613 1727096145.37765: dumping result to json 12613 1727096145.37770: done dumping result, returning 12613 1727096145.37777: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-a9dd-d073-0000000000de] 12613 1727096145.37782: sending task result for task 0afff68d-5257-a9dd-d073-0000000000de 12613 1727096145.37876: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000de skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.37919: no more pending results, returning what we have 12613 1727096145.37922: results queue empty 12613 1727096145.37923: checking for any_errors_fatal 12613 1727096145.37928: done checking for any_errors_fatal 12613 1727096145.37928: checking for max_fail_percentage 12613 1727096145.37930: done checking for max_fail_percentage 12613 1727096145.37931: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.37932: done checking to see if all hosts have failed 12613 1727096145.37932: getting the remaining hosts for this loop 12613 1727096145.37933: done getting the remaining hosts for this loop 12613 1727096145.37937: getting the next task for host managed_node1 12613 1727096145.37943: done getting next task for host managed_node1 12613 1727096145.37946: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096145.37948: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.37965: getting variables 12613 1727096145.37969: in VariableManager get_vars() 12613 1727096145.38021: Calling all_inventory to load vars for managed_node1 12613 1727096145.38024: Calling groups_inventory to load vars for managed_node1 12613 1727096145.38026: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.38035: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.38037: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.38040: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.38226: WORKER PROCESS EXITING 12613 1727096145.38238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.38360: done with get_vars() 12613 1727096145.38370: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:45 -0400 (0:00:00.036) 0:00:09.021 ****** 12613 1727096145.38430: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096145.38637: worker is 1 (out of 1 available) 12613 1727096145.38650: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096145.38662: done queuing things up, now waiting for results queue to drain 12613 1727096145.38664: waiting for pending results... 
12613 1727096145.38836: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096145.38924: in run() - task 0afff68d-5257-a9dd-d073-0000000000df 12613 1727096145.38935: variable 'ansible_search_path' from source: unknown 12613 1727096145.38938: variable 'ansible_search_path' from source: unknown 12613 1727096145.38970: calling self._execute() 12613 1727096145.39049: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.39056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.39063: variable 'omit' from source: magic vars 12613 1727096145.39381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.40921: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.40978: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.41007: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.41032: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.41055: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.41116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.41137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.41156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.41186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.41197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.41297: variable 'ansible_distribution' from source: facts 12613 1727096145.41301: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.41317: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.41319: when evaluation is False, skipping this task 12613 1727096145.41322: _execute() done 12613 1727096145.41324: dumping result to json 12613 1727096145.41326: done dumping result, returning 12613 1727096145.41334: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-a9dd-d073-0000000000df] 12613 1727096145.41338: sending task result for task 0afff68d-5257-a9dd-d073-0000000000df 12613 1727096145.41430: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000df 12613 1727096145.41432: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.41483: no more pending results, returning what we have 12613 1727096145.41486: results queue empty 12613 1727096145.41487: checking for any_errors_fatal 12613 1727096145.41494: done checking for any_errors_fatal 12613 1727096145.41495: checking for max_fail_percentage 12613 1727096145.41497: done checking for max_fail_percentage 12613 1727096145.41498: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.41498: done checking to see if all hosts have failed 12613 1727096145.41499: getting the remaining hosts for this loop 12613 1727096145.41500: done getting the remaining hosts for this loop 12613 1727096145.41503: getting the next task for host managed_node1 12613 1727096145.41509: done getting next task for host managed_node1 12613 1727096145.41513: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096145.41515: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.41533: getting variables 12613 1727096145.41534: in VariableManager get_vars() 12613 1727096145.41592: Calling all_inventory to load vars for managed_node1 12613 1727096145.41595: Calling groups_inventory to load vars for managed_node1 12613 1727096145.41597: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.41605: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.41607: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.41610: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.41748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.41880: done with get_vars() 12613 1727096145.41889: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:45 -0400 (0:00:00.035) 0:00:09.056 ****** 12613 1727096145.41950: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096145.42159: worker is 1 (out of 1 available) 12613 1727096145.42171: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096145.42183: done queuing things up, now waiting for results queue to drain 12613 1727096145.42185: waiting for pending results... 
12613 1727096145.42350: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096145.42434: in run() - task 0afff68d-5257-a9dd-d073-0000000000e0 12613 1727096145.42446: variable 'ansible_search_path' from source: unknown 12613 1727096145.42450: variable 'ansible_search_path' from source: unknown 12613 1727096145.42484: calling self._execute() 12613 1727096145.42552: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.42559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.42569: variable 'omit' from source: magic vars 12613 1727096145.42884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.44443: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.44494: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.44522: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.44547: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.44570: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.44631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.44652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.44674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.44704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.44715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.44816: variable 'ansible_distribution' from source: facts 12613 1727096145.44819: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.44834: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.44837: when evaluation is False, skipping this task 12613 1727096145.44840: _execute() done 12613 1727096145.44842: dumping result to json 12613 1727096145.44845: done dumping result, returning 12613 1727096145.44852: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-a9dd-d073-0000000000e0] 12613 1727096145.44859: sending task result for task 0afff68d-5257-a9dd-d073-0000000000e0 12613 1727096145.44948: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000e0 12613 1727096145.44951: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.44998: no more pending results, returning what we have 12613 1727096145.45001: results queue empty 12613 1727096145.45002: checking for any_errors_fatal 12613 1727096145.45008: done checking for any_errors_fatal 12613 1727096145.45008: checking for max_fail_percentage 12613 1727096145.45010: done checking for max_fail_percentage 12613 1727096145.45011: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.45012: done checking to see if all hosts have failed 12613 1727096145.45012: getting the remaining hosts for this loop 12613 1727096145.45014: done getting the remaining hosts for this loop 12613 1727096145.45017: getting the next task for host managed_node1 12613 1727096145.45024: done getting next task for host managed_node1 12613 1727096145.45028: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096145.45031: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.45048: getting variables 12613 1727096145.45050: in VariableManager get_vars() 12613 1727096145.45104: Calling all_inventory to load vars for managed_node1 12613 1727096145.45106: Calling groups_inventory to load vars for managed_node1 12613 1727096145.45109: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.45118: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.45120: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.45123: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.45331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.45458: done with get_vars() 12613 1727096145.45466: done getting variables 12613 1727096145.45512: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:45 -0400 (0:00:00.035) 0:00:09.092 ****** 12613 1727096145.45535: entering _queue_task() for managed_node1/debug 12613 1727096145.45759: worker is 1 (out of 1 available) 12613 1727096145.45774: exiting _queue_task() for managed_node1/debug 12613 1727096145.45785: done queuing things up, now waiting for results queue to drain 12613 1727096145.45787: waiting for pending results... 
12613 1727096145.45963: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096145.46041: in run() - task 0afff68d-5257-a9dd-d073-0000000000e1 12613 1727096145.46055: variable 'ansible_search_path' from source: unknown 12613 1727096145.46059: variable 'ansible_search_path' from source: unknown 12613 1727096145.46087: calling self._execute() 12613 1727096145.46237: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.46242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.46245: variable 'omit' from source: magic vars 12613 1727096145.46488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.48457: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.48516: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.48543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.48572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.48592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.48653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.48678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.48696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.48725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.48735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.48835: variable 'ansible_distribution' from source: facts 12613 1727096145.48839: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.48856: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.48860: when evaluation is False, skipping this task 12613 1727096145.48862: _execute() done 12613 1727096145.48864: dumping result to json 12613 1727096145.48870: done dumping result, returning 12613 1727096145.48877: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-a9dd-d073-0000000000e1] 12613 1727096145.48881: sending task result for task 0afff68d-5257-a9dd-d073-0000000000e1 12613 1727096145.48961: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000e1 12613 1727096145.48964: WORKER 
PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096145.49010: no more pending results, returning what we have 12613 1727096145.49014: results queue empty 12613 1727096145.49015: checking for any_errors_fatal 12613 1727096145.49020: done checking for any_errors_fatal 12613 1727096145.49020: checking for max_fail_percentage 12613 1727096145.49022: done checking for max_fail_percentage 12613 1727096145.49023: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.49023: done checking to see if all hosts have failed 12613 1727096145.49024: getting the remaining hosts for this loop 12613 1727096145.49025: done getting the remaining hosts for this loop 12613 1727096145.49030: getting the next task for host managed_node1 12613 1727096145.49036: done getting next task for host managed_node1 12613 1727096145.49040: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096145.49042: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.49059: getting variables 12613 1727096145.49061: in VariableManager get_vars() 12613 1727096145.49120: Calling all_inventory to load vars for managed_node1 12613 1727096145.49123: Calling groups_inventory to load vars for managed_node1 12613 1727096145.49125: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.49134: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.49136: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.49138: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.49281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.49411: done with get_vars() 12613 1727096145.49420: done getting variables 12613 1727096145.49463: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:45 -0400 (0:00:00.039) 0:00:09.132 ****** 12613 1727096145.49488: entering _queue_task() for managed_node1/debug 12613 1727096145.49692: worker is 1 (out of 1 available) 12613 1727096145.49704: exiting _queue_task() for managed_node1/debug 12613 1727096145.49717: done queuing things up, now waiting for results queue to drain 12613 1727096145.49718: waiting for pending results... 
12613 1727096145.50009: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096145.50107: in run() - task 0afff68d-5257-a9dd-d073-0000000000e2 12613 1727096145.50138: variable 'ansible_search_path' from source: unknown 12613 1727096145.50146: variable 'ansible_search_path' from source: unknown 12613 1727096145.50189: calling self._execute() 12613 1727096145.50417: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.50436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.50539: variable 'omit' from source: magic vars 12613 1727096145.51029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.52962: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.53009: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.53041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.53068: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.53091: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.53149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.53171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.53189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.53216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.53227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.53326: variable 'ansible_distribution' from source: facts 12613 1727096145.53330: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.53345: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.53348: when evaluation is False, skipping this task 12613 1727096145.53350: _execute() done 12613 1727096145.53356: dumping result to json 12613 1727096145.53358: done dumping result, returning 12613 1727096145.53369: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-a9dd-d073-0000000000e2] 12613 1727096145.53372: sending task result for task 0afff68d-5257-a9dd-d073-0000000000e2 12613 1727096145.53456: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000e2 12613 1727096145.53460: WORKER 
PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096145.53523: no more pending results, returning what we have 12613 1727096145.53526: results queue empty 12613 1727096145.53527: checking for any_errors_fatal 12613 1727096145.53531: done checking for any_errors_fatal 12613 1727096145.53532: checking for max_fail_percentage 12613 1727096145.53534: done checking for max_fail_percentage 12613 1727096145.53535: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.53536: done checking to see if all hosts have failed 12613 1727096145.53536: getting the remaining hosts for this loop 12613 1727096145.53537: done getting the remaining hosts for this loop 12613 1727096145.53541: getting the next task for host managed_node1 12613 1727096145.53547: done getting next task for host managed_node1 12613 1727096145.53551: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096145.53556: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.53575: getting variables 12613 1727096145.53578: in VariableManager get_vars() 12613 1727096145.53626: Calling all_inventory to load vars for managed_node1 12613 1727096145.53628: Calling groups_inventory to load vars for managed_node1 12613 1727096145.53631: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.53639: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.53641: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.53643: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.53926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.54124: done with get_vars() 12613 1727096145.54136: done getting variables 12613 1727096145.54194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:45 -0400 (0:00:00.047) 0:00:09.179 ****** 12613 1727096145.54227: entering _queue_task() for managed_node1/debug 12613 1727096145.54511: worker is 1 (out of 1 available) 12613 1727096145.54523: exiting _queue_task() for managed_node1/debug 12613 1727096145.54534: done queuing things up, now waiting for results queue to drain 12613 1727096145.54536: waiting for pending results... 
12613 1727096145.54833: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096145.54930: in run() - task 0afff68d-5257-a9dd-d073-0000000000e3 12613 1727096145.54941: variable 'ansible_search_path' from source: unknown 12613 1727096145.54946: variable 'ansible_search_path' from source: unknown 12613 1727096145.54987: calling self._execute() 12613 1727096145.55065: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.55070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.55081: variable 'omit' from source: magic vars 12613 1727096145.55402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.57087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.57134: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.57173: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.57247: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.57285: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.57381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.57437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.57542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.57546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.57548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.57696: variable 'ansible_distribution' from source: facts 12613 1727096145.57706: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.57727: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.57735: when evaluation is False, skipping this task 12613 1727096145.57742: _execute() done 12613 1727096145.57758: dumping result to json 12613 1727096145.57773: done dumping result, returning 12613 1727096145.57785: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-a9dd-d073-0000000000e3] 12613 1727096145.57795: sending task result for task 0afff68d-5257-a9dd-d073-0000000000e3 12613 1727096145.58079: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000e3 skipping: [managed_node1] => { 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096145.58124: no more pending results, returning what we have 12613 1727096145.58127: results queue empty 12613 1727096145.58128: checking for any_errors_fatal 12613 1727096145.58133: done checking for any_errors_fatal 12613 1727096145.58134: checking for max_fail_percentage 12613 1727096145.58136: done checking for max_fail_percentage 12613 1727096145.58136: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.58137: done checking to see if all hosts have failed 12613 1727096145.58138: getting the remaining hosts for this loop 12613 1727096145.58139: done getting the remaining hosts for this loop 12613 1727096145.58142: getting the next task for host managed_node1 12613 1727096145.58149: done getting next task for host managed_node1 12613 1727096145.58155: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096145.58157: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.58177: getting variables 12613 1727096145.58178: in VariableManager get_vars() 12613 1727096145.58247: Calling all_inventory to load vars for managed_node1 12613 1727096145.58250: Calling groups_inventory to load vars for managed_node1 12613 1727096145.58254: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.58260: WORKER PROCESS EXITING 12613 1727096145.58270: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.58306: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.58326: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.58459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.58591: done with get_vars() 12613 1727096145.58599: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:45 -0400 (0:00:00.044) 0:00:09.223 ****** 12613 1727096145.58677: entering _queue_task() for managed_node1/ping 12613 1727096145.58931: worker is 1 (out of 1 available) 12613 1727096145.58943: exiting _queue_task() for managed_node1/ping 12613 1727096145.58957: done queuing things up, now waiting for results queue to drain 12613 1727096145.58959: waiting for pending results... 
12613 1727096145.59117: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096145.59196: in run() - task 0afff68d-5257-a9dd-d073-0000000000e4 12613 1727096145.59209: variable 'ansible_search_path' from source: unknown 12613 1727096145.59212: variable 'ansible_search_path' from source: unknown 12613 1727096145.59241: calling self._execute() 12613 1727096145.59310: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.59315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.59436: variable 'omit' from source: magic vars 12613 1727096145.59639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.62317: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.62404: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.62471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.62537: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.62584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.62645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.62671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.62694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.62727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.62737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.62847: variable 'ansible_distribution' from source: facts 12613 1727096145.62850: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.62871: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.62874: when evaluation is False, skipping this task 12613 1727096145.62877: _execute() done 12613 1727096145.62879: dumping result to json 12613 1727096145.62882: done dumping result, returning 12613 1727096145.62894: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-a9dd-d073-0000000000e4] 12613 1727096145.62897: sending task result for task 0afff68d-5257-a9dd-d073-0000000000e4 12613 1727096145.62982: done sending task result for task 0afff68d-5257-a9dd-d073-0000000000e4 12613 1727096145.62984: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": 
false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.63035: no more pending results, returning what we have 12613 1727096145.63038: results queue empty 12613 1727096145.63039: checking for any_errors_fatal 12613 1727096145.63045: done checking for any_errors_fatal 12613 1727096145.63046: checking for max_fail_percentage 12613 1727096145.63047: done checking for max_fail_percentage 12613 1727096145.63048: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.63049: done checking to see if all hosts have failed 12613 1727096145.63050: getting the remaining hosts for this loop 12613 1727096145.63051: done getting the remaining hosts for this loop 12613 1727096145.63054: getting the next task for host managed_node1 12613 1727096145.63063: done getting next task for host managed_node1 12613 1727096145.63065: ^ task is: TASK: meta (role_complete) 12613 1727096145.63069: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.63088: getting variables 12613 1727096145.63090: in VariableManager get_vars() 12613 1727096145.63146: Calling all_inventory to load vars for managed_node1 12613 1727096145.63149: Calling groups_inventory to load vars for managed_node1 12613 1727096145.63151: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.63160: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.63163: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.63165: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.63349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.63472: done with get_vars() 12613 1727096145.63481: done getting variables 12613 1727096145.63533: done queuing things up, now waiting for results queue to drain 12613 1727096145.63535: results queue empty 12613 1727096145.63535: checking for any_errors_fatal 12613 1727096145.63537: done checking for any_errors_fatal 12613 1727096145.63537: checking for max_fail_percentage 12613 1727096145.63539: done checking for max_fail_percentage 12613 1727096145.63539: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.63540: done checking to see if all hosts have failed 12613 1727096145.63541: getting the remaining hosts for this loop 12613 1727096145.63542: done getting the remaining hosts for this loop 12613 1727096145.63544: getting the next task for host managed_node1 12613 1727096145.63548: done getting next task for host managed_node1 12613 1727096145.63550: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096145.63551: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.63558: getting variables 12613 1727096145.63559: in VariableManager get_vars() 12613 1727096145.63574: Calling all_inventory to load vars for managed_node1 12613 1727096145.63576: Calling groups_inventory to load vars for managed_node1 12613 1727096145.63577: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.63585: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.63587: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.63588: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.63669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.63783: done with get_vars() 12613 1727096145.63789: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:45 -0400 (0:00:00.051) 0:00:09.275 ****** 12613 1727096145.63839: entering _queue_task() for managed_node1/include_tasks 12613 1727096145.64063: worker is 1 (out of 1 available) 12613 1727096145.64078: exiting _queue_task() for managed_node1/include_tasks 12613 1727096145.64090: done queuing things up, now waiting for results queue to drain 12613 1727096145.64092: waiting for pending results... 
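Every task in this block is skipped the same way: the executor loads the filter plugins, pulls ansible_distribution and ansible_distribution_major_version from facts, evaluates the conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)) as False, and returns a skip result without contacting the managed host. A minimal sketch of a task guarded by that kind of when: clause is shown below; the task name and debug message are illustrative assumptions, and only the condition mirrors the false_condition string reported in the log.

    # Hedged sketch only: not taken from the role's source.
    # The when: expression is the one the log reports as false_condition.
    - name: Example task that only runs on CentOS/RHEL 8 or older
      ansible.builtin.debug:
        msg: "Running on {{ ansible_distribution }} {{ ansible_distribution_major_version }}"
      when:
        - ansible_distribution in ['CentOS', 'RedHat']
        - ansible_distribution_major_version | int < 9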
12613 1727096145.64405: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096145.64453: in run() - task 0afff68d-5257-a9dd-d073-00000000011b 12613 1727096145.64500: variable 'ansible_search_path' from source: unknown 12613 1727096145.64773: variable 'ansible_search_path' from source: unknown 12613 1727096145.64776: calling self._execute() 12613 1727096145.64779: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.65074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.65078: variable 'omit' from source: magic vars 12613 1727096145.66009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.68457: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.68541: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.68593: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.68634: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.68664: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.68755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.68788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.68820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.68858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.68877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.69035: variable 'ansible_distribution' from source: facts 12613 1727096145.69047: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.69074: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.69082: when evaluation is False, skipping this task 12613 1727096145.69089: _execute() done 12613 1727096145.69096: dumping result to json 12613 1727096145.69103: done dumping result, returning 12613 1727096145.69115: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-a9dd-d073-00000000011b] 12613 1727096145.69125: sending task result for task 0afff68d-5257-a9dd-d073-00000000011b skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int 
< 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.69424: no more pending results, returning what we have 12613 1727096145.69429: results queue empty 12613 1727096145.69430: checking for any_errors_fatal 12613 1727096145.69431: done checking for any_errors_fatal 12613 1727096145.69432: checking for max_fail_percentage 12613 1727096145.69434: done checking for max_fail_percentage 12613 1727096145.69435: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.69435: done checking to see if all hosts have failed 12613 1727096145.69436: getting the remaining hosts for this loop 12613 1727096145.69438: done getting the remaining hosts for this loop 12613 1727096145.69442: getting the next task for host managed_node1 12613 1727096145.69450: done getting next task for host managed_node1 12613 1727096145.69454: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096145.69457: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.69494: getting variables 12613 1727096145.69496: in VariableManager get_vars() 12613 1727096145.69555: Calling all_inventory to load vars for managed_node1 12613 1727096145.69558: Calling groups_inventory to load vars for managed_node1 12613 1727096145.69561: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.69779: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.69783: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.69788: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.69964: done sending task result for task 0afff68d-5257-a9dd-d073-00000000011b 12613 1727096145.69971: WORKER PROCESS EXITING 12613 1727096145.70057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.70274: done with get_vars() 12613 1727096145.70286: done getting variables 12613 1727096145.70353: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:45 -0400 (0:00:00.065) 0:00:09.341 ****** 12613 1727096145.70390: entering _queue_task() for managed_node1/debug 12613 1727096145.70713: worker is 1 (out of 1 available) 12613 1727096145.70723: exiting _queue_task() for managed_node1/debug 12613 1727096145.70734: done queuing things up, now waiting for results queue to drain 12613 1727096145.70735: waiting for pending results... 
12613 1727096145.71011: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096145.71148: in run() - task 0afff68d-5257-a9dd-d073-00000000011c 12613 1727096145.71169: variable 'ansible_search_path' from source: unknown 12613 1727096145.71177: variable 'ansible_search_path' from source: unknown 12613 1727096145.71223: calling self._execute() 12613 1727096145.71318: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.71330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.71350: variable 'omit' from source: magic vars 12613 1727096145.71805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.74159: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.74247: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.74289: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.74340: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.74377: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.74478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.74515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.74548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.74607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.74671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.74804: variable 'ansible_distribution' from source: facts 12613 1727096145.74815: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.74837: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.74844: when evaluation is False, skipping this task 12613 1727096145.74851: _execute() done 12613 1727096145.74859: dumping result to json 12613 1727096145.74869: done dumping result, returning 12613 1727096145.74901: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-a9dd-d073-00000000011c] 12613 1727096145.74904: sending task result for task 0afff68d-5257-a9dd-d073-00000000011c skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096145.75155: no more 
pending results, returning what we have 12613 1727096145.75159: results queue empty 12613 1727096145.75160: checking for any_errors_fatal 12613 1727096145.75166: done checking for any_errors_fatal 12613 1727096145.75169: checking for max_fail_percentage 12613 1727096145.75171: done checking for max_fail_percentage 12613 1727096145.75172: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.75173: done checking to see if all hosts have failed 12613 1727096145.75174: getting the remaining hosts for this loop 12613 1727096145.75175: done getting the remaining hosts for this loop 12613 1727096145.75180: getting the next task for host managed_node1 12613 1727096145.75188: done getting next task for host managed_node1 12613 1727096145.75192: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096145.75195: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.75217: getting variables 12613 1727096145.75219: in VariableManager get_vars() 12613 1727096145.75399: Calling all_inventory to load vars for managed_node1 12613 1727096145.75402: Calling groups_inventory to load vars for managed_node1 12613 1727096145.75405: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.75478: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.75481: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.75485: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.75500: done sending task result for task 0afff68d-5257-a9dd-d073-00000000011c 12613 1727096145.75503: WORKER PROCESS EXITING 12613 1727096145.75770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.75988: done with get_vars() 12613 1727096145.76002: done getting variables 12613 1727096145.76072: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:45 -0400 (0:00:00.057) 0:00:09.398 ****** 12613 1727096145.76107: entering _queue_task() for managed_node1/fail 12613 1727096145.76439: worker is 1 (out of 1 available) 12613 1727096145.76451: exiting _queue_task() for managed_node1/fail 12613 1727096145.76464: done queuing things up, now waiting for results queue to drain 12613 1727096145.76466: waiting for pending results... 
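The "Print network provider" entry loads the debug action plugin before being skipped, so it is presumably a simple debug task around the role's network_provider variable. A hedged sketch of that shape follows; the exact message wording is an assumption, not quoted from the role.

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # network_provider is the role's provider variable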
12613 1727096145.76770: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096145.76913: in run() - task 0afff68d-5257-a9dd-d073-00000000011d 12613 1727096145.76931: variable 'ansible_search_path' from source: unknown 12613 1727096145.76937: variable 'ansible_search_path' from source: unknown 12613 1727096145.76980: calling self._execute() 12613 1727096145.77078: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.77090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.77123: variable 'omit' from source: magic vars 12613 1727096145.77547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.80056: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.80069: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.80113: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.80154: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.80193: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.80286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.80373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.80377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.80400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.80419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.80558: variable 'ansible_distribution' from source: facts 12613 1727096145.80570: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.80590: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.80596: when evaluation is False, skipping this task 12613 1727096145.80602: _execute() done 12613 1727096145.80614: dumping result to json 12613 1727096145.80620: done dumping result, returning 12613 1727096145.80631: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-a9dd-d073-00000000011d] 12613 1727096145.80671: sending task result for task 
0afff68d-5257-a9dd-d073-00000000011d 12613 1727096145.80910: done sending task result for task 0afff68d-5257-a9dd-d073-00000000011d 12613 1727096145.80913: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.80966: no more pending results, returning what we have 12613 1727096145.80973: results queue empty 12613 1727096145.80974: checking for any_errors_fatal 12613 1727096145.80979: done checking for any_errors_fatal 12613 1727096145.80980: checking for max_fail_percentage 12613 1727096145.80982: done checking for max_fail_percentage 12613 1727096145.80983: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.80984: done checking to see if all hosts have failed 12613 1727096145.80985: getting the remaining hosts for this loop 12613 1727096145.80986: done getting the remaining hosts for this loop 12613 1727096145.80991: getting the next task for host managed_node1 12613 1727096145.81000: done getting next task for host managed_node1 12613 1727096145.81005: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096145.81008: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096145.81178: getting variables 12613 1727096145.81180: in VariableManager get_vars() 12613 1727096145.81232: Calling all_inventory to load vars for managed_node1 12613 1727096145.81235: Calling groups_inventory to load vars for managed_node1 12613 1727096145.81237: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.81246: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.81248: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.81251: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.81552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.81769: done with get_vars() 12613 1727096145.81783: done getting variables 12613 1727096145.81855: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:45 -0400 (0:00:00.057) 0:00:09.456 ****** 12613 1727096145.81892: entering _queue_task() for managed_node1/fail 12613 1727096145.82253: worker is 1 (out of 1 available) 12613 1727096145.82376: exiting _queue_task() for managed_node1/fail 12613 1727096145.82388: done queuing things up, now waiting for results queue to drain 12613 1727096145.82389: waiting for pending results... 
12613 1727096145.82591: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096145.82746: in run() - task 0afff68d-5257-a9dd-d073-00000000011e 12613 1727096145.82769: variable 'ansible_search_path' from source: unknown 12613 1727096145.82809: variable 'ansible_search_path' from source: unknown 12613 1727096145.82831: calling self._execute() 12613 1727096145.82926: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.82941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.82958: variable 'omit' from source: magic vars 12613 1727096145.83421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.86527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.86612: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.86661: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.86744: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.86747: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.86818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.86859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.86894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.86939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.86965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.87111: variable 'ansible_distribution' from source: facts 12613 1727096145.87123: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.87176: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.87182: when evaluation is False, skipping this task 12613 1727096145.87185: _execute() done 12613 1727096145.87188: dumping result to json 12613 1727096145.87190: done dumping result, returning 12613 1727096145.87192: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-a9dd-d073-00000000011e] 12613 1727096145.87194: sending task result for task 0afff68d-5257-a9dd-d073-00000000011e skipping: [managed_node1] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.87445: no more pending results, returning what we have 12613 1727096145.87449: results queue empty 12613 1727096145.87450: checking for any_errors_fatal 12613 1727096145.87457: done checking for any_errors_fatal 12613 1727096145.87458: checking for max_fail_percentage 12613 1727096145.87459: done checking for max_fail_percentage 12613 1727096145.87460: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.87461: done checking to see if all hosts have failed 12613 1727096145.87462: getting the remaining hosts for this loop 12613 1727096145.87463: done getting the remaining hosts for this loop 12613 1727096145.87466: getting the next task for host managed_node1 12613 1727096145.87475: done getting next task for host managed_node1 12613 1727096145.87479: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096145.87481: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096145.87504: getting variables 12613 1727096145.87507: in VariableManager get_vars() 12613 1727096145.87558: Calling all_inventory to load vars for managed_node1 12613 1727096145.87560: Calling groups_inventory to load vars for managed_node1 12613 1727096145.87562: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.87728: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.87733: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.87737: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.88093: done sending task result for task 0afff68d-5257-a9dd-d073-00000000011e 12613 1727096145.88097: WORKER PROCESS EXITING 12613 1727096145.88122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.88333: done with get_vars() 12613 1727096145.88346: done getting variables 12613 1727096145.88408: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:45 -0400 (0:00:00.065) 0:00:09.521 ****** 12613 1727096145.88439: entering _queue_task() for managed_node1/fail 12613 1727096145.88838: worker is 1 (out of 1 available) 12613 1727096145.88851: exiting _queue_task() for managed_node1/fail 
12613 1727096145.88863: done queuing things up, now waiting for results queue to drain 12613 1727096145.88865: waiting for pending results... 12613 1727096145.89053: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096145.89200: in run() - task 0afff68d-5257-a9dd-d073-00000000011f 12613 1727096145.89220: variable 'ansible_search_path' from source: unknown 12613 1727096145.89256: variable 'ansible_search_path' from source: unknown 12613 1727096145.89277: calling self._execute() 12613 1727096145.89383: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.89394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.89418: variable 'omit' from source: magic vars 12613 1727096145.89911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.93087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.93173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.93254: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.93262: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.93293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.93385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.93417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.93474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096145.93496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096145.93515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096145.93652: variable 'ansible_distribution' from source: facts 12613 1727096145.93683: variable 'ansible_distribution_major_version' from source: facts 12613 1727096145.93697: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096145.93705: when evaluation is False, skipping this task 12613 1727096145.93792: _execute() done 12613 1727096145.93797: dumping result to json 12613 1727096145.93799: done dumping result, returning 12613 1727096145.93801: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 
[0afff68d-5257-a9dd-d073-00000000011f] 12613 1727096145.93803: sending task result for task 0afff68d-5257-a9dd-d073-00000000011f skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096145.93947: no more pending results, returning what we have 12613 1727096145.93951: results queue empty 12613 1727096145.93952: checking for any_errors_fatal 12613 1727096145.93964: done checking for any_errors_fatal 12613 1727096145.93965: checking for max_fail_percentage 12613 1727096145.93969: done checking for max_fail_percentage 12613 1727096145.93970: checking to see if all hosts have failed and the running result is not ok 12613 1727096145.93971: done checking to see if all hosts have failed 12613 1727096145.93972: getting the remaining hosts for this loop 12613 1727096145.93973: done getting the remaining hosts for this loop 12613 1727096145.93978: getting the next task for host managed_node1 12613 1727096145.93985: done getting next task for host managed_node1 12613 1727096145.93989: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096145.93991: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096145.94015: getting variables 12613 1727096145.94017: in VariableManager get_vars() 12613 1727096145.94252: Calling all_inventory to load vars for managed_node1 12613 1727096145.94255: Calling groups_inventory to load vars for managed_node1 12613 1727096145.94258: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096145.94349: Calling all_plugins_play to load vars for managed_node1 12613 1727096145.94353: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096145.94356: Calling groups_plugins_play to load vars for managed_node1 12613 1727096145.94712: done sending task result for task 0afff68d-5257-a9dd-d073-00000000011f 12613 1727096145.94715: WORKER PROCESS EXITING 12613 1727096145.94739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096145.94948: done with get_vars() 12613 1727096145.94960: done getting variables 12613 1727096145.95022: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:45 -0400 (0:00:00.066) 0:00:09.587 ****** 12613 1727096145.95053: entering _queue_task() for managed_node1/dnf 12613 1727096145.95462: worker is 1 (out of 1 available) 12613 1727096145.95475: exiting _queue_task() for managed_node1/dnf 12613 1727096145.95485: done queuing things up, now waiting for results queue to drain 12613 1727096145.95486: waiting for pending results... 
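The three "Abort applying ..." entries above each load the fail action plugin before being skipped, which marks them as guard tasks built on ansible.builtin.fail. A hedged sketch of such a guard is shown below; the message text and the exact when: conditions are illustrative assumptions (network_state and network_provider are the role's documented variables), not the role's actual tasks.

    - name: Abort if network_state is combined with the initscripts provider
      ansible.builtin.fail:
        msg: >-
          The network_state variable cannot be applied with the initscripts
          provider; use the nm provider or remove network_state.
      when:
        - network_state is defined                           # assumed guard
        - network_provider | default('nm') == 'initscripts'  # assumed guard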
12613 1727096145.95685: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096145.95821: in run() - task 0afff68d-5257-a9dd-d073-000000000120 12613 1727096145.95844: variable 'ansible_search_path' from source: unknown 12613 1727096145.95852: variable 'ansible_search_path' from source: unknown 12613 1727096145.95941: calling self._execute() 12613 1727096145.96065: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096145.96081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096145.96096: variable 'omit' from source: magic vars 12613 1727096145.96572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096145.99261: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096145.99349: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096145.99575: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096145.99580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096145.99673: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096145.99766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096145.99833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096145.99931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.00024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.00090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.00317: variable 'ansible_distribution' from source: facts 12613 1727096146.00347: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.00372: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.00407: when evaluation is False, skipping this task 12613 1727096146.00446: _execute() done 12613 1727096146.00458: dumping result to json 12613 1727096146.00466: done dumping result, returning 12613 1727096146.00487: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000120] 12613 1727096146.00496: sending task result for task 
0afff68d-5257-a9dd-d073-000000000120 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.00751: no more pending results, returning what we have 12613 1727096146.00755: results queue empty 12613 1727096146.00756: checking for any_errors_fatal 12613 1727096146.00766: done checking for any_errors_fatal 12613 1727096146.00769: checking for max_fail_percentage 12613 1727096146.00771: done checking for max_fail_percentage 12613 1727096146.00772: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.00773: done checking to see if all hosts have failed 12613 1727096146.00773: getting the remaining hosts for this loop 12613 1727096146.00775: done getting the remaining hosts for this loop 12613 1727096146.00778: getting the next task for host managed_node1 12613 1727096146.00786: done getting next task for host managed_node1 12613 1727096146.00789: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096146.00792: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096146.00812: getting variables 12613 1727096146.00814: in VariableManager get_vars() 12613 1727096146.00866: Calling all_inventory to load vars for managed_node1 12613 1727096146.00984: Calling groups_inventory to load vars for managed_node1 12613 1727096146.00987: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.00998: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.01001: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.01004: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.01312: done sending task result for task 0afff68d-5257-a9dd-d073-000000000120 12613 1727096146.01315: WORKER PROCESS EXITING 12613 1727096146.01340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.01553: done with get_vars() 12613 1727096146.01565: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096146.01636: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:46 -0400 (0:00:00.066) 0:00:09.653 ****** 12613 1727096146.01665: entering _queue_task() for managed_node1/yum 12613 1727096146.02047: worker is 1 (out of 1 available) 12613 1727096146.02060: exiting _queue_task() for managed_node1/yum 12613 1727096146.02206: done queuing things up, now waiting for results queue to drain 12613 1727096146.02208: waiting for pending results... 
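The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above is ansible-core's module redirect at work: the task is written against the yum module, but on this host it is dispatched to the dnf action plugin, which is why the queue entry reads managed_node1/yum while the dnf action module is loaded. A minimal illustrative task of that kind (the package name and register variable are assumptions for the example):

    - name: Check whether a newer NetworkManager is available
      ansible.builtin.yum:          # redirected to ansible.builtin.dnf on dnf-based hosts
        name: NetworkManager
        state: latest
      check_mode: true              # report-only: do not actually update anything
      register: nm_update_check     # hypothetical variable name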
12613 1727096146.02688: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096146.03028: in run() - task 0afff68d-5257-a9dd-d073-000000000121 12613 1727096146.03033: variable 'ansible_search_path' from source: unknown 12613 1727096146.03036: variable 'ansible_search_path' from source: unknown 12613 1727096146.03039: calling self._execute() 12613 1727096146.03205: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.03218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.03263: variable 'omit' from source: magic vars 12613 1727096146.04130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.06391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.06440: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.06471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.06497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.06517: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.06582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.06603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.06621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.06649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.06662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.06763: variable 'ansible_distribution' from source: facts 12613 1727096146.06767: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.06783: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.06788: when evaluation is False, skipping this task 12613 1727096146.06791: _execute() done 12613 1727096146.06793: dumping result to json 12613 1727096146.06796: done dumping result, returning 12613 1727096146.06804: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000121] 12613 1727096146.06807: sending task result for task 
0afff68d-5257-a9dd-d073-000000000121 12613 1727096146.06898: done sending task result for task 0afff68d-5257-a9dd-d073-000000000121 12613 1727096146.06901: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.07006: no more pending results, returning what we have 12613 1727096146.07013: results queue empty 12613 1727096146.07014: checking for any_errors_fatal 12613 1727096146.07020: done checking for any_errors_fatal 12613 1727096146.07021: checking for max_fail_percentage 12613 1727096146.07023: done checking for max_fail_percentage 12613 1727096146.07024: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.07024: done checking to see if all hosts have failed 12613 1727096146.07025: getting the remaining hosts for this loop 12613 1727096146.07026: done getting the remaining hosts for this loop 12613 1727096146.07031: getting the next task for host managed_node1 12613 1727096146.07155: done getting next task for host managed_node1 12613 1727096146.07159: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096146.07162: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096146.07383: getting variables 12613 1727096146.07385: in VariableManager get_vars() 12613 1727096146.07434: Calling all_inventory to load vars for managed_node1 12613 1727096146.07437: Calling groups_inventory to load vars for managed_node1 12613 1727096146.07440: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.07448: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.07451: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.07455: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.07914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.08292: done with get_vars() 12613 1727096146.08415: done getting variables 12613 1727096146.08487: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:46 -0400 (0:00:00.068) 0:00:09.722 ****** 12613 1727096146.08527: entering _queue_task() for managed_node1/fail 12613 1727096146.08803: worker is 1 (out of 1 available) 12613 1727096146.08817: exiting _queue_task() for managed_node1/fail 12613 1727096146.08828: done queuing things up, now waiting for results queue to drain 12613 1727096146.08829: waiting for pending results... 
12613 1727096146.09006: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096146.09098: in run() - task 0afff68d-5257-a9dd-d073-000000000122 12613 1727096146.09109: variable 'ansible_search_path' from source: unknown 12613 1727096146.09113: variable 'ansible_search_path' from source: unknown 12613 1727096146.09143: calling self._execute() 12613 1727096146.09215: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.09219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.09228: variable 'omit' from source: magic vars 12613 1727096146.09546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.11575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.11578: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.11581: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.11583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.11585: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.11659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.11698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.11729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.11799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.11817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.11992: variable 'ansible_distribution' from source: facts 12613 1727096146.11996: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.12011: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.12014: when evaluation is False, skipping this task 12613 1727096146.12017: _execute() done 12613 1727096146.12019: dumping result to json 12613 1727096146.12023: done dumping result, returning 12613 1727096146.12030: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000122] 12613 1727096146.12035: sending task result for task 0afff68d-5257-a9dd-d073-000000000122 12613 1727096146.12132: done sending task result for task 
0afff68d-5257-a9dd-d073-000000000122 12613 1727096146.12135: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.12200: no more pending results, returning what we have 12613 1727096146.12203: results queue empty 12613 1727096146.12204: checking for any_errors_fatal 12613 1727096146.12212: done checking for any_errors_fatal 12613 1727096146.12212: checking for max_fail_percentage 12613 1727096146.12214: done checking for max_fail_percentage 12613 1727096146.12215: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.12216: done checking to see if all hosts have failed 12613 1727096146.12216: getting the remaining hosts for this loop 12613 1727096146.12218: done getting the remaining hosts for this loop 12613 1727096146.12221: getting the next task for host managed_node1 12613 1727096146.12229: done getting next task for host managed_node1 12613 1727096146.12232: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12613 1727096146.12234: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.12253: getting variables 12613 1727096146.12255: in VariableManager get_vars() 12613 1727096146.12309: Calling all_inventory to load vars for managed_node1 12613 1727096146.12312: Calling groups_inventory to load vars for managed_node1 12613 1727096146.12314: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.12322: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.12324: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.12327: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.12461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.12585: done with get_vars() 12613 1727096146.12594: done getting variables 12613 1727096146.12636: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:46 -0400 (0:00:00.041) 0:00:09.763 ****** 12613 1727096146.12661: entering _queue_task() for managed_node1/package 12613 1727096146.12871: worker is 1 (out of 1 available) 12613 1727096146.12886: exiting _queue_task() for managed_node1/package 12613 1727096146.12898: done queuing things up, now waiting for results queue to drain 12613 1727096146.12899: waiting for pending results... 
12613 1727096146.13082: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 12613 1727096146.13170: in run() - task 0afff68d-5257-a9dd-d073-000000000123 12613 1727096146.13182: variable 'ansible_search_path' from source: unknown 12613 1727096146.13186: variable 'ansible_search_path' from source: unknown 12613 1727096146.13217: calling self._execute() 12613 1727096146.13287: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.13291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.13300: variable 'omit' from source: magic vars 12613 1727096146.13662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.15832: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.15877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.15905: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.15932: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.15953: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.16016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.16036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.16059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.16087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.16098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.16198: variable 'ansible_distribution' from source: facts 12613 1727096146.16202: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.16218: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.16220: when evaluation is False, skipping this task 12613 1727096146.16223: _execute() done 12613 1727096146.16226: dumping result to json 12613 1727096146.16228: done dumping result, returning 12613 1727096146.16236: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-a9dd-d073-000000000123] 12613 1727096146.16240: sending task result for task 0afff68d-5257-a9dd-d073-000000000123 12613 1727096146.16332: done sending task result for task 0afff68d-5257-a9dd-d073-000000000123 12613 1727096146.16334: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.16394: no more pending results, returning what we have 12613 1727096146.16398: results queue empty 12613 1727096146.16399: checking for any_errors_fatal 12613 1727096146.16404: done checking for any_errors_fatal 12613 1727096146.16405: checking for max_fail_percentage 12613 1727096146.16406: done checking for max_fail_percentage 12613 1727096146.16407: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.16408: done checking to see if all hosts have failed 12613 1727096146.16409: getting the remaining hosts for this loop 12613 1727096146.16410: done getting the remaining hosts for this loop 12613 1727096146.16414: getting the next task for host managed_node1 12613 1727096146.16420: done getting next task for host managed_node1 12613 1727096146.16424: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096146.16426: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.16444: getting variables 12613 1727096146.16446: in VariableManager get_vars() 12613 1727096146.16505: Calling all_inventory to load vars for managed_node1 12613 1727096146.16508: Calling groups_inventory to load vars for managed_node1 12613 1727096146.16511: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.16520: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.16522: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.16525: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.16709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.16832: done with get_vars() 12613 1727096146.16840: done getting variables 12613 1727096146.16885: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:46 -0400 (0:00:00.042) 0:00:09.806 ****** 12613 1727096146.16910: entering _queue_task() for managed_node1/package 12613 1727096146.17136: worker is 1 (out of 1 available) 12613 1727096146.17155: exiting _queue_task() for managed_node1/package 12613 1727096146.17167: done queuing things up, now waiting for results queue to drain 12613 1727096146.17170: waiting for pending results... 
12613 1727096146.17486: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096146.17524: in run() - task 0afff68d-5257-a9dd-d073-000000000124 12613 1727096146.17546: variable 'ansible_search_path' from source: unknown 12613 1727096146.17555: variable 'ansible_search_path' from source: unknown 12613 1727096146.17598: calling self._execute() 12613 1727096146.17693: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.17706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.17720: variable 'omit' from source: magic vars 12613 1727096146.18215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.19826: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.20155: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.20160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.20428: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.20432: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.20438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.20479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.20514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.20562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.20587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.20737: variable 'ansible_distribution' from source: facts 12613 1727096146.20751: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.20779: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.20788: when evaluation is False, skipping this task 12613 1727096146.20796: _execute() done 12613 1727096146.20804: dumping result to json 12613 1727096146.20813: done dumping result, returning 12613 1727096146.20826: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-a9dd-d073-000000000124] 12613 1727096146.20836: sending task result for task 0afff68d-5257-a9dd-d073-000000000124 skipping: [managed_node1] => { "changed": false, "false_condition": 
"(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.21006: no more pending results, returning what we have 12613 1727096146.21009: results queue empty 12613 1727096146.21010: checking for any_errors_fatal 12613 1727096146.21023: done checking for any_errors_fatal 12613 1727096146.21024: checking for max_fail_percentage 12613 1727096146.21026: done checking for max_fail_percentage 12613 1727096146.21027: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.21028: done checking to see if all hosts have failed 12613 1727096146.21029: getting the remaining hosts for this loop 12613 1727096146.21030: done getting the remaining hosts for this loop 12613 1727096146.21034: getting the next task for host managed_node1 12613 1727096146.21041: done getting next task for host managed_node1 12613 1727096146.21044: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096146.21047: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.21069: getting variables 12613 1727096146.21071: in VariableManager get_vars() 12613 1727096146.21120: Calling all_inventory to load vars for managed_node1 12613 1727096146.21239: Calling groups_inventory to load vars for managed_node1 12613 1727096146.21243: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.21255: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.21257: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.21261: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.21272: done sending task result for task 0afff68d-5257-a9dd-d073-000000000124 12613 1727096146.21276: WORKER PROCESS EXITING 12613 1727096146.21485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.21691: done with get_vars() 12613 1727096146.21704: done getting variables 12613 1727096146.21769: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:46 -0400 (0:00:00.048) 0:00:09.855 ****** 12613 1727096146.21804: entering _queue_task() for managed_node1/package 12613 1727096146.22099: worker is 1 (out of 1 available) 12613 1727096146.22112: exiting _queue_task() for managed_node1/package 12613 1727096146.22125: done queuing things up, now waiting for results queue to drain 12613 
1727096146.22126: waiting for pending results... 12613 1727096146.22414: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096146.22555: in run() - task 0afff68d-5257-a9dd-d073-000000000125 12613 1727096146.22579: variable 'ansible_search_path' from source: unknown 12613 1727096146.22595: variable 'ansible_search_path' from source: unknown 12613 1727096146.22639: calling self._execute() 12613 1727096146.22740: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.22754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.22780: variable 'omit' from source: magic vars 12613 1727096146.23254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.25906: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.25956: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.25984: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.26009: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.26030: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.26094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.26115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.26134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.26161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.26174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.26283: variable 'ansible_distribution' from source: facts 12613 1727096146.26473: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.26476: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.26479: when evaluation is False, skipping this task 12613 1727096146.26481: _execute() done 12613 1727096146.26484: dumping result to json 12613 1727096146.26486: done dumping result, returning 12613 1727096146.26488: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-a9dd-d073-000000000125] 12613 1727096146.26490: sending task result for task 0afff68d-5257-a9dd-d073-000000000125 12613 1727096146.26558: done sending task result for 
task 0afff68d-5257-a9dd-d073-000000000125 12613 1727096146.26561: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.26630: no more pending results, returning what we have 12613 1727096146.26634: results queue empty 12613 1727096146.26635: checking for any_errors_fatal 12613 1727096146.26640: done checking for any_errors_fatal 12613 1727096146.26641: checking for max_fail_percentage 12613 1727096146.26642: done checking for max_fail_percentage 12613 1727096146.26643: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.26644: done checking to see if all hosts have failed 12613 1727096146.26644: getting the remaining hosts for this loop 12613 1727096146.26646: done getting the remaining hosts for this loop 12613 1727096146.26649: getting the next task for host managed_node1 12613 1727096146.26655: done getting next task for host managed_node1 12613 1727096146.26658: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096146.26660: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096146.26679: getting variables 12613 1727096146.26680: in VariableManager get_vars() 12613 1727096146.26724: Calling all_inventory to load vars for managed_node1 12613 1727096146.26726: Calling groups_inventory to load vars for managed_node1 12613 1727096146.26728: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.26736: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.26738: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.26741: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.26967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.27161: done with get_vars() 12613 1727096146.27174: done getting variables 12613 1727096146.27230: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:46 -0400 (0:00:00.054) 0:00:09.909 ****** 12613 1727096146.27261: entering _queue_task() for managed_node1/service 12613 1727096146.27616: worker is 1 (out of 1 available) 12613 1727096146.27630: exiting _queue_task() for managed_node1/service 12613 1727096146.27644: done queuing things up, now waiting for results queue to drain 12613 1727096146.27645: waiting for pending results... 
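The three install tasks skipped so far (Install packages, Install NetworkManager and nmstate when using network_state variable, Install python3-libnmstate when using network_state variable) all load the same generic 'package' action plugin, which defers to the platform's package manager on the target. A minimal sketch of a task of this shape, gated by the same conditional shown earlier (the variable name below is hypothetical and not visible in this log):

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # hypothetical variable name, not shown in this log
        state: present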
12613 1727096146.27841: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096146.27928: in run() - task 0afff68d-5257-a9dd-d073-000000000126 12613 1727096146.27940: variable 'ansible_search_path' from source: unknown 12613 1727096146.27944: variable 'ansible_search_path' from source: unknown 12613 1727096146.27978: calling self._execute() 12613 1727096146.28045: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.28049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.28062: variable 'omit' from source: magic vars 12613 1727096146.28387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.30474: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.30478: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.30483: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.30534: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.30570: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.30655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.30691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.30720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.30765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.30789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.30938: variable 'ansible_distribution' from source: facts 12613 1727096146.30950: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.30977: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.30985: when evaluation is False, skipping this task 12613 1727096146.30993: _execute() done 12613 1727096146.31000: dumping result to json 12613 1727096146.31008: done dumping result, returning 12613 1727096146.31019: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000126] 12613 1727096146.31029: sending task result for task 0afff68d-5257-a9dd-d073-000000000126 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.31181: no more pending results, returning what we have 12613 1727096146.31185: results queue empty 12613 1727096146.31186: checking for any_errors_fatal 12613 1727096146.31191: done checking for any_errors_fatal 12613 1727096146.31191: checking for max_fail_percentage 12613 1727096146.31193: done checking for max_fail_percentage 12613 1727096146.31193: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.31194: done checking to see if all hosts have failed 12613 1727096146.31195: getting the remaining hosts for this loop 12613 1727096146.31196: done getting the remaining hosts for this loop 12613 1727096146.31200: getting the next task for host managed_node1 12613 1727096146.31208: done getting next task for host managed_node1 12613 1727096146.31211: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096146.31214: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.31238: getting variables 12613 1727096146.31239: in VariableManager get_vars() 12613 1727096146.31290: Calling all_inventory to load vars for managed_node1 12613 1727096146.31292: Calling groups_inventory to load vars for managed_node1 12613 1727096146.31294: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.31304: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.31307: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.31309: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.31615: done sending task result for task 0afff68d-5257-a9dd-d073-000000000126 12613 1727096146.31618: WORKER PROCESS EXITING 12613 1727096146.31641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.31858: done with get_vars() 12613 1727096146.31872: done getting variables 12613 1727096146.31933: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:46 -0400 (0:00:00.047) 0:00:09.956 ****** 12613 1727096146.31971: entering _queue_task() for managed_node1/service 12613 1727096146.32244: worker is 1 (out of 1 available) 12613 1727096146.32259: exiting _queue_task() for managed_node1/service 12613 1727096146.32272: done queuing things up, now waiting for results queue to drain 12613 1727096146.32274: waiting for pending results... 
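The Restart NetworkManager task, and the Enable and start tasks queued next, resolve to the 'service' action plugin instead of 'package'. A minimal sketch of a restart task of this kind, again under the same distribution gate (the module choice is an assumption; only the service name is implied by the task title):

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted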
12613 1727096146.32462: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096146.32552: in run() - task 0afff68d-5257-a9dd-d073-000000000127 12613 1727096146.32566: variable 'ansible_search_path' from source: unknown 12613 1727096146.32572: variable 'ansible_search_path' from source: unknown 12613 1727096146.32603: calling self._execute() 12613 1727096146.32673: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.32680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.32689: variable 'omit' from source: magic vars 12613 1727096146.33007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.35230: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.35441: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.35446: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.35449: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.35451: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.35494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.35530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.35562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.35629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.35649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.35795: variable 'ansible_distribution' from source: facts 12613 1727096146.35815: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.35839: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.36075: when evaluation is False, skipping this task 12613 1727096146.36079: _execute() done 12613 1727096146.36082: dumping result to json 12613 1727096146.36084: done dumping result, returning 12613 1727096146.36087: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-a9dd-d073-000000000127] 12613 1727096146.36090: sending task result for task 0afff68d-5257-a9dd-d073-000000000127 12613 1727096146.36158: done sending task result for task 0afff68d-5257-a9dd-d073-000000000127 12613 1727096146.36162: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 1727096146.36209: no more pending results, returning what we have 12613 1727096146.36213: results queue empty 12613 1727096146.36214: checking for any_errors_fatal 12613 1727096146.36220: done checking for any_errors_fatal 12613 1727096146.36220: checking for max_fail_percentage 12613 1727096146.36222: done checking for max_fail_percentage 12613 1727096146.36223: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.36224: done checking to see if all hosts have failed 12613 1727096146.36224: getting the remaining hosts for this loop 12613 1727096146.36226: done getting the remaining hosts for this loop 12613 1727096146.36230: getting the next task for host managed_node1 12613 1727096146.36235: done getting next task for host managed_node1 12613 1727096146.36239: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096146.36241: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.36258: getting variables 12613 1727096146.36260: in VariableManager get_vars() 12613 1727096146.36365: Calling all_inventory to load vars for managed_node1 12613 1727096146.36372: Calling groups_inventory to load vars for managed_node1 12613 1727096146.36382: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.36393: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.36396: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.36400: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.36562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.36686: done with get_vars() 12613 1727096146.36694: done getting variables 12613 1727096146.36736: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:46 -0400 (0:00:00.047) 0:00:10.004 ****** 12613 1727096146.36760: entering _queue_task() for managed_node1/service 12613 1727096146.36985: worker is 1 (out of 1 available) 12613 1727096146.36998: exiting _queue_task() for managed_node1/service 12613 1727096146.37009: done queuing things up, now waiting for results queue to drain 12613 1727096146.37011: waiting for pending results... 
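Unlike the earlier skips, the result for Enable and start NetworkManager is censored: the task sets no_log: true, so even a skipped result is replaced by the fixed "output has been hidden" placeholder in the callback output. A minimal sketch of the mechanism (module and parameters are assumptions; only no_log: true is confirmed by the log line above):

    - name: Enable and start NetworkManager
      ansible.builtin.service:   # assumed module
        name: NetworkManager
        state: started
        enabled: true
      no_log: true               # replaces the result with the "censored" placeholder seen above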
12613 1727096146.37292: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096146.37404: in run() - task 0afff68d-5257-a9dd-d073-000000000128 12613 1727096146.37408: variable 'ansible_search_path' from source: unknown 12613 1727096146.37411: variable 'ansible_search_path' from source: unknown 12613 1727096146.37430: calling self._execute() 12613 1727096146.37536: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.37548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.37566: variable 'omit' from source: magic vars 12613 1727096146.38046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.41389: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.41464: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.41525: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.41763: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.41791: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.41989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.42009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.42033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.42274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.42277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.42280: variable 'ansible_distribution' from source: facts 12613 1727096146.42282: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.42285: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.42287: when evaluation is False, skipping this task 12613 1727096146.42290: _execute() done 12613 1727096146.42293: dumping result to json 12613 1727096146.42296: done dumping result, returning 12613 1727096146.42301: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-a9dd-d073-000000000128] 12613 1727096146.42303: sending task result for task 0afff68d-5257-a9dd-d073-000000000128 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)", "skip_reason": "Conditional result was False" } 12613 1727096146.42464: no more pending results, returning what we have 12613 1727096146.42471: results queue empty 12613 1727096146.42472: checking for any_errors_fatal 12613 1727096146.42479: done checking for any_errors_fatal 12613 1727096146.42479: checking for max_fail_percentage 12613 1727096146.42481: done checking for max_fail_percentage 12613 1727096146.42482: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.42483: done checking to see if all hosts have failed 12613 1727096146.42483: getting the remaining hosts for this loop 12613 1727096146.42485: done getting the remaining hosts for this loop 12613 1727096146.42488: getting the next task for host managed_node1 12613 1727096146.42495: done getting next task for host managed_node1 12613 1727096146.42499: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096146.42501: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.42520: getting variables 12613 1727096146.42522: in VariableManager get_vars() 12613 1727096146.42578: Calling all_inventory to load vars for managed_node1 12613 1727096146.42581: Calling groups_inventory to load vars for managed_node1 12613 1727096146.42584: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.42594: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.42597: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.42600: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.43081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.43315: done with get_vars() 12613 1727096146.43326: done getting variables 12613 1727096146.43411: done sending task result for task 0afff68d-5257-a9dd-d073-000000000128 12613 1727096146.43414: WORKER PROCESS EXITING 12613 1727096146.43466: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:46 -0400 (0:00:00.067) 0:00:10.072 ****** 12613 1727096146.43500: entering _queue_task() for managed_node1/service 12613 1727096146.43792: worker is 1 (out of 1 available) 12613 1727096146.43804: exiting _queue_task() for managed_node1/service 12613 1727096146.43816: done queuing things up, now waiting for results queue to drain 12613 1727096146.43817: waiting for pending results... 
12613 1727096146.44096: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096146.44274: in run() - task 0afff68d-5257-a9dd-d073-000000000129 12613 1727096146.44278: variable 'ansible_search_path' from source: unknown 12613 1727096146.44280: variable 'ansible_search_path' from source: unknown 12613 1727096146.44296: calling self._execute() 12613 1727096146.44384: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.44403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.44418: variable 'omit' from source: magic vars 12613 1727096146.45221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.47575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.47615: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.47651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.47773: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.47778: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.47858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.47903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.47936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.48005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.48009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.48153: variable 'ansible_distribution' from source: facts 12613 1727096146.48222: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.48226: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.48229: when evaluation is False, skipping this task 12613 1727096146.48231: _execute() done 12613 1727096146.48234: dumping result to json 12613 1727096146.48236: done dumping result, returning 12613 1727096146.48239: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-a9dd-d073-000000000129] 12613 1727096146.48241: sending task result for task 0afff68d-5257-a9dd-d073-000000000129 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 
1727096146.48496: no more pending results, returning what we have 12613 1727096146.48500: results queue empty 12613 1727096146.48501: checking for any_errors_fatal 12613 1727096146.48506: done checking for any_errors_fatal 12613 1727096146.48507: checking for max_fail_percentage 12613 1727096146.48509: done checking for max_fail_percentage 12613 1727096146.48510: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.48511: done checking to see if all hosts have failed 12613 1727096146.48512: getting the remaining hosts for this loop 12613 1727096146.48513: done getting the remaining hosts for this loop 12613 1727096146.48517: getting the next task for host managed_node1 12613 1727096146.48524: done getting next task for host managed_node1 12613 1727096146.48528: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096146.48531: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.48552: getting variables 12613 1727096146.48554: in VariableManager get_vars() 12613 1727096146.48613: Calling all_inventory to load vars for managed_node1 12613 1727096146.48616: Calling groups_inventory to load vars for managed_node1 12613 1727096146.48619: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.48630: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.48634: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.48637: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.49254: done sending task result for task 0afff68d-5257-a9dd-d073-000000000129 12613 1727096146.49258: WORKER PROCESS EXITING 12613 1727096146.49349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.49555: done with get_vars() 12613 1727096146.49565: done getting variables 12613 1727096146.49633: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:46 -0400 (0:00:00.061) 0:00:10.133 ****** 12613 1727096146.49664: entering _queue_task() for managed_node1/copy 12613 1727096146.50076: worker is 1 (out of 1 available) 12613 1727096146.50087: exiting _queue_task() for managed_node1/copy 12613 1727096146.50097: done queuing things up, now waiting for results queue to drain 12613 1727096146.50098: waiting for pending results... 
12613 1727096146.50288: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096146.50438: in run() - task 0afff68d-5257-a9dd-d073-00000000012a 12613 1727096146.50459: variable 'ansible_search_path' from source: unknown 12613 1727096146.50471: variable 'ansible_search_path' from source: unknown 12613 1727096146.50516: calling self._execute() 12613 1727096146.50618: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.50624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.50634: variable 'omit' from source: magic vars 12613 1727096146.50963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.52775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.52780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.52783: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.52821: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.52850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.52944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.52988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.53023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.53082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.53104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.53260: variable 'ansible_distribution' from source: facts 12613 1727096146.53274: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.53295: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.53304: when evaluation is False, skipping this task 12613 1727096146.53311: _execute() done 12613 1727096146.53319: dumping result to json 12613 1727096146.53372: done dumping result, returning 12613 1727096146.53376: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-a9dd-d073-00000000012a] 12613 1727096146.53378: sending task result for task 0afff68d-5257-a9dd-d073-00000000012a skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.53513: no more pending results, returning what we have 12613 1727096146.53517: results queue empty 12613 1727096146.53517: checking for any_errors_fatal 12613 1727096146.53524: done checking for any_errors_fatal 12613 1727096146.53524: checking for max_fail_percentage 12613 1727096146.53526: done checking for max_fail_percentage 12613 1727096146.53526: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.53527: done checking to see if all hosts have failed 12613 1727096146.53528: getting the remaining hosts for this loop 12613 1727096146.53529: done getting the remaining hosts for this loop 12613 1727096146.53533: getting the next task for host managed_node1 12613 1727096146.53542: done getting next task for host managed_node1 12613 1727096146.53547: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096146.53549: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.53571: getting variables 12613 1727096146.53573: in VariableManager get_vars() 12613 1727096146.53622: Calling all_inventory to load vars for managed_node1 12613 1727096146.53625: Calling groups_inventory to load vars for managed_node1 12613 1727096146.53627: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.53636: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.53639: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.53641: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.53805: done sending task result for task 0afff68d-5257-a9dd-d073-00000000012a 12613 1727096146.53808: WORKER PROCESS EXITING 12613 1727096146.53818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.53947: done with get_vars() 12613 1727096146.53958: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:46 -0400 (0:00:00.043) 0:00:10.177 ****** 12613 1727096146.54020: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096146.54233: worker is 1 (out of 1 available) 12613 1727096146.54246: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096146.54260: done queuing things up, now waiting for results queue to drain 12613 1727096146.54262: waiting for pending results... 
12613 1727096146.54434: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096146.54526: in run() - task 0afff68d-5257-a9dd-d073-00000000012b 12613 1727096146.54538: variable 'ansible_search_path' from source: unknown 12613 1727096146.54542: variable 'ansible_search_path' from source: unknown 12613 1727096146.54571: calling self._execute() 12613 1727096146.54640: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.54644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.54656: variable 'omit' from source: magic vars 12613 1727096146.55033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.57131: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.57181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.57219: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.57245: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.57265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.57329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.57349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.57370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.57396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.57410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.57508: variable 'ansible_distribution' from source: facts 12613 1727096146.57514: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.57531: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.57534: when evaluation is False, skipping this task 12613 1727096146.57537: _execute() done 12613 1727096146.57539: dumping result to json 12613 1727096146.57541: done dumping result, returning 12613 1727096146.57548: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-a9dd-d073-00000000012b] 12613 1727096146.57554: sending task result for task 0afff68d-5257-a9dd-d073-00000000012b 12613 1727096146.57645: done sending task result for task 0afff68d-5257-a9dd-d073-00000000012b 12613 1727096146.57647: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.57699: no more pending results, returning what we have 12613 1727096146.57702: results queue empty 12613 1727096146.57703: checking for any_errors_fatal 12613 1727096146.57708: done checking for any_errors_fatal 12613 1727096146.57708: checking for max_fail_percentage 12613 1727096146.57710: done checking for max_fail_percentage 12613 1727096146.57711: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.57712: done checking to see if all hosts have failed 12613 1727096146.57712: getting the remaining hosts for this loop 12613 1727096146.57713: done getting the remaining hosts for this loop 12613 1727096146.57717: getting the next task for host managed_node1 12613 1727096146.57723: done getting next task for host managed_node1 12613 1727096146.57727: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096146.57729: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.57747: getting variables 12613 1727096146.57749: in VariableManager get_vars() 12613 1727096146.57802: Calling all_inventory to load vars for managed_node1 12613 1727096146.57805: Calling groups_inventory to load vars for managed_node1 12613 1727096146.57807: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.57815: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.57817: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.57820: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.57993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.58117: done with get_vars() 12613 1727096146.58125: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:46 -0400 (0:00:00.041) 0:00:10.219 ****** 12613 1727096146.58187: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096146.58406: worker is 1 (out of 1 available) 12613 1727096146.58418: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096146.58431: done queuing things up, now waiting for results queue to drain 12613 1727096146.58432: waiting for pending results... 
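The skipped "Configure networking connection profiles" and "Configure networking state" tasks belong to the fedora.linux_system_roles.network role. A minimal sketch, with assumed variable values, of how a play typically drives that role:

- hosts: managed_node1
  tasks:
    - name: Apply the network role with one bond controller profile
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0
            type: bond
            state: up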
12613 1727096146.58688: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096146.58974: in run() - task 0afff68d-5257-a9dd-d073-00000000012c 12613 1727096146.58979: variable 'ansible_search_path' from source: unknown 12613 1727096146.58982: variable 'ansible_search_path' from source: unknown 12613 1727096146.58986: calling self._execute() 12613 1727096146.58989: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.58992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.58995: variable 'omit' from source: magic vars 12613 1727096146.59366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.61221: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.61268: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.61296: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.61322: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.61341: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.61405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.61425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.61443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.61472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.61485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.61584: variable 'ansible_distribution' from source: facts 12613 1727096146.61588: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.61606: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.61609: when evaluation is False, skipping this task 12613 1727096146.61612: _execute() done 12613 1727096146.61614: dumping result to json 12613 1727096146.61617: done dumping result, returning 12613 1727096146.61621: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-a9dd-d073-00000000012c] 12613 1727096146.61626: sending task result for task 0afff68d-5257-a9dd-d073-00000000012c skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12613 1727096146.61768: no more pending results, returning what we have 12613 1727096146.61772: results queue empty 12613 1727096146.61773: checking for any_errors_fatal 12613 1727096146.61778: done checking for any_errors_fatal 12613 1727096146.61778: checking for max_fail_percentage 12613 1727096146.61780: done checking for max_fail_percentage 12613 1727096146.61781: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.61782: done checking to see if all hosts have failed 12613 1727096146.61783: getting the remaining hosts for this loop 12613 1727096146.61784: done getting the remaining hosts for this loop 12613 1727096146.61787: getting the next task for host managed_node1 12613 1727096146.61794: done getting next task for host managed_node1 12613 1727096146.61797: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096146.61799: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.61821: getting variables 12613 1727096146.61822: in VariableManager get_vars() 12613 1727096146.61878: Calling all_inventory to load vars for managed_node1 12613 1727096146.61881: Calling groups_inventory to load vars for managed_node1 12613 1727096146.61883: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.61889: done sending task result for task 0afff68d-5257-a9dd-d073-00000000012c 12613 1727096146.61891: WORKER PROCESS EXITING 12613 1727096146.61900: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.61903: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.61906: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.62037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.62172: done with get_vars() 12613 1727096146.62180: done getting variables 12613 1727096146.62223: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:46 -0400 (0:00:00.040) 0:00:10.259 ****** 12613 1727096146.62246: entering _queue_task() for managed_node1/debug 12613 1727096146.62534: worker is 1 (out of 1 available) 12613 1727096146.62545: exiting _queue_task() for managed_node1/debug 12613 1727096146.62559: done queuing things up, now waiting for results queue to drain 12613 1727096146.62560: waiting for pending results... 
12613 1727096146.62770: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096146.62907: in run() - task 0afff68d-5257-a9dd-d073-00000000012d 12613 1727096146.62928: variable 'ansible_search_path' from source: unknown 12613 1727096146.62935: variable 'ansible_search_path' from source: unknown 12613 1727096146.62983: calling self._execute() 12613 1727096146.63116: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.63130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.63147: variable 'omit' from source: magic vars 12613 1727096146.63649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.65629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.65688: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.65715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.65741: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.65763: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.65831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.65851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.65871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.65904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.65913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.66016: variable 'ansible_distribution' from source: facts 12613 1727096146.66020: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.66036: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.66038: when evaluation is False, skipping this task 12613 1727096146.66041: _execute() done 12613 1727096146.66044: dumping result to json 12613 1727096146.66046: done dumping result, returning 12613 1727096146.66054: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-a9dd-d073-00000000012d] 12613 1727096146.66061: sending task result for task 0afff68d-5257-a9dd-d073-00000000012d 12613 1727096146.66145: done sending task result for task 0afff68d-5257-a9dd-d073-00000000012d 12613 1727096146.66148: WORKER 
PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096146.66190: no more pending results, returning what we have 12613 1727096146.66194: results queue empty 12613 1727096146.66195: checking for any_errors_fatal 12613 1727096146.66201: done checking for any_errors_fatal 12613 1727096146.66202: checking for max_fail_percentage 12613 1727096146.66204: done checking for max_fail_percentage 12613 1727096146.66205: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.66205: done checking to see if all hosts have failed 12613 1727096146.66206: getting the remaining hosts for this loop 12613 1727096146.66207: done getting the remaining hosts for this loop 12613 1727096146.66211: getting the next task for host managed_node1 12613 1727096146.66217: done getting next task for host managed_node1 12613 1727096146.66220: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096146.66223: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.66242: getting variables 12613 1727096146.66244: in VariableManager get_vars() 12613 1727096146.66347: Calling all_inventory to load vars for managed_node1 12613 1727096146.66350: Calling groups_inventory to load vars for managed_node1 12613 1727096146.66352: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.66360: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.66362: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.66365: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.66486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.66608: done with get_vars() 12613 1727096146.66616: done getting variables 12613 1727096146.66657: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:46 -0400 (0:00:00.044) 0:00:10.304 ****** 12613 1727096146.66682: entering _queue_task() for managed_node1/debug 12613 1727096146.66884: worker is 1 (out of 1 available) 12613 1727096146.66897: exiting _queue_task() for managed_node1/debug 12613 1727096146.66909: done queuing things up, now waiting for results queue to drain 12613 1727096146.66910: waiting for pending results... 
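The false_condition string in the skipping output above is carried in the task result, so a later task can report why a guarded step did not run. Sketch with hypothetical names:

- name: Guarded step
  ansible.builtin.command: /bin/true
  register: guarded
  when: ansible_distribution_major_version | int < 9

- name: Report why the guarded step was skipped
  ansible.builtin.debug:
    var: guarded.false_condition
  when: guarded is skipped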
12613 1727096146.67086: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096146.67171: in run() - task 0afff68d-5257-a9dd-d073-00000000012e 12613 1727096146.67183: variable 'ansible_search_path' from source: unknown 12613 1727096146.67186: variable 'ansible_search_path' from source: unknown 12613 1727096146.67214: calling self._execute() 12613 1727096146.67285: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.67289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.67298: variable 'omit' from source: magic vars 12613 1727096146.67666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.69952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.69957: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.69960: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.69975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.69997: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.70061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.70089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.70106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.70135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.70147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.70248: variable 'ansible_distribution' from source: facts 12613 1727096146.70251: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.70271: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.70274: when evaluation is False, skipping this task 12613 1727096146.70277: _execute() done 12613 1727096146.70279: dumping result to json 12613 1727096146.70282: done dumping result, returning 12613 1727096146.70290: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-a9dd-d073-00000000012e] 12613 1727096146.70292: sending task result for task 0afff68d-5257-a9dd-d073-00000000012e 12613 1727096146.70381: done sending task result for task 0afff68d-5257-a9dd-d073-00000000012e 12613 1727096146.70384: WORKER 
PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096146.70426: no more pending results, returning what we have 12613 1727096146.70429: results queue empty 12613 1727096146.70430: checking for any_errors_fatal 12613 1727096146.70436: done checking for any_errors_fatal 12613 1727096146.70437: checking for max_fail_percentage 12613 1727096146.70439: done checking for max_fail_percentage 12613 1727096146.70440: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.70440: done checking to see if all hosts have failed 12613 1727096146.70441: getting the remaining hosts for this loop 12613 1727096146.70442: done getting the remaining hosts for this loop 12613 1727096146.70445: getting the next task for host managed_node1 12613 1727096146.70451: done getting next task for host managed_node1 12613 1727096146.70455: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096146.70457: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.70477: getting variables 12613 1727096146.70479: in VariableManager get_vars() 12613 1727096146.70527: Calling all_inventory to load vars for managed_node1 12613 1727096146.70530: Calling groups_inventory to load vars for managed_node1 12613 1727096146.70532: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.70541: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.70543: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.70546: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.70693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.70841: done with get_vars() 12613 1727096146.70848: done getting variables 12613 1727096146.70891: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:46 -0400 (0:00:00.042) 0:00:10.346 ****** 12613 1727096146.70917: entering _queue_task() for managed_node1/debug 12613 1727096146.71119: worker is 1 (out of 1 available) 12613 1727096146.71131: exiting _queue_task() for managed_node1/debug 12613 1727096146.71142: done queuing things up, now waiting for results queue to drain 12613 1727096146.71144: waiting for pending results... 
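The "network_state" tasks refer to the role's declarative, nmstate-style input. A sketch of that variable with assumed values:

network_state:
  interfaces:
    - name: bond0
      type: bond
      state: up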
12613 1727096146.71324: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096146.71407: in run() - task 0afff68d-5257-a9dd-d073-00000000012f 12613 1727096146.71418: variable 'ansible_search_path' from source: unknown 12613 1727096146.71422: variable 'ansible_search_path' from source: unknown 12613 1727096146.71450: calling self._execute() 12613 1727096146.71521: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.71524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.71533: variable 'omit' from source: magic vars 12613 1727096146.72173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.73734: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.73795: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.73820: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.73846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.73866: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.73930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.73951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.73971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.73997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.74012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.74106: variable 'ansible_distribution' from source: facts 12613 1727096146.74110: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.74128: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.74131: when evaluation is False, skipping this task 12613 1727096146.74134: _execute() done 12613 1727096146.74136: dumping result to json 12613 1727096146.74138: done dumping result, returning 12613 1727096146.74146: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-a9dd-d073-00000000012f] 12613 1727096146.74150: sending task result for task 0afff68d-5257-a9dd-d073-00000000012f 12613 1727096146.74237: done sending task result for task 0afff68d-5257-a9dd-d073-00000000012f 12613 1727096146.74240: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096146.74297: no more pending results, returning what we have 12613 1727096146.74300: results queue empty 12613 1727096146.74301: checking for any_errors_fatal 12613 1727096146.74308: done checking for any_errors_fatal 12613 1727096146.74309: checking for max_fail_percentage 12613 1727096146.74310: done checking for max_fail_percentage 12613 1727096146.74311: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.74312: done checking to see if all hosts have failed 12613 1727096146.74313: getting the remaining hosts for this loop 12613 1727096146.74314: done getting the remaining hosts for this loop 12613 1727096146.74317: getting the next task for host managed_node1 12613 1727096146.74324: done getting next task for host managed_node1 12613 1727096146.74328: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096146.74330: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.74353: getting variables 12613 1727096146.74355: in VariableManager get_vars() 12613 1727096146.74402: Calling all_inventory to load vars for managed_node1 12613 1727096146.74405: Calling groups_inventory to load vars for managed_node1 12613 1727096146.74407: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.74414: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.74417: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.74419: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.74553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.74741: done with get_vars() 12613 1727096146.74752: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:46 -0400 (0:00:00.039) 0:00:10.385 ****** 12613 1727096146.74848: entering _queue_task() for managed_node1/ping 12613 1727096146.75117: worker is 1 (out of 1 available) 12613 1727096146.75128: exiting _queue_task() for managed_node1/ping 12613 1727096146.75140: done queuing things up, now waiting for results queue to drain 12613 1727096146.75142: waiting for pending results... 
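The queued "Re-test connectivity" task maps to the ping action loaded above; a minimal equivalent task:

- name: Re-test connectivity
  ansible.builtin.ping: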
12613 1727096146.75493: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096146.75546: in run() - task 0afff68d-5257-a9dd-d073-000000000130 12613 1727096146.75565: variable 'ansible_search_path' from source: unknown 12613 1727096146.75576: variable 'ansible_search_path' from source: unknown 12613 1727096146.75619: calling self._execute() 12613 1727096146.75713: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.75724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.75738: variable 'omit' from source: magic vars 12613 1727096146.76173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.77772: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.77898: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.77903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.77986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.77990: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.78054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.78091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.78128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.78175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.78194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.78335: variable 'ansible_distribution' from source: facts 12613 1727096146.78441: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.78445: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.78447: when evaluation is False, skipping this task 12613 1727096146.78448: _execute() done 12613 1727096146.78451: dumping result to json 12613 1727096146.78454: done dumping result, returning 12613 1727096146.78456: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-a9dd-d073-000000000130] 12613 1727096146.78458: sending task result for task 0afff68d-5257-a9dd-d073-000000000130 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": 
"Conditional result was False" } 12613 1727096146.78571: no more pending results, returning what we have 12613 1727096146.78583: results queue empty 12613 1727096146.78588: checking for any_errors_fatal 12613 1727096146.78594: done checking for any_errors_fatal 12613 1727096146.78595: checking for max_fail_percentage 12613 1727096146.78597: done checking for max_fail_percentage 12613 1727096146.78597: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.78598: done checking to see if all hosts have failed 12613 1727096146.78599: getting the remaining hosts for this loop 12613 1727096146.78600: done getting the remaining hosts for this loop 12613 1727096146.78621: getting the next task for host managed_node1 12613 1727096146.78632: done getting next task for host managed_node1 12613 1727096146.78634: ^ task is: TASK: meta (role_complete) 12613 1727096146.78637: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.78648: done sending task result for task 0afff68d-5257-a9dd-d073-000000000130 12613 1727096146.78651: WORKER PROCESS EXITING 12613 1727096146.78742: getting variables 12613 1727096146.78744: in VariableManager get_vars() 12613 1727096146.78795: Calling all_inventory to load vars for managed_node1 12613 1727096146.78798: Calling groups_inventory to load vars for managed_node1 12613 1727096146.78800: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.78809: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.78812: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.78815: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.79094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.79312: done with get_vars() 12613 1727096146.79322: done getting variables 12613 1727096146.79410: done queuing things up, now waiting for results queue to drain 12613 1727096146.79412: results queue empty 12613 1727096146.79413: checking for any_errors_fatal 12613 1727096146.79415: done checking for any_errors_fatal 12613 1727096146.79416: checking for max_fail_percentage 12613 1727096146.79417: done checking for max_fail_percentage 12613 1727096146.79417: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.79418: done checking to see if all hosts have failed 12613 1727096146.79419: getting the remaining hosts for this loop 12613 1727096146.79420: done getting the remaining hosts for this loop 12613 1727096146.79422: getting the next task for host managed_node1 12613 1727096146.79426: done getting next task for host managed_node1 12613 1727096146.79429: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 12613 1727096146.79430: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.79432: getting variables 12613 1727096146.79433: in VariableManager get_vars() 12613 1727096146.79455: Calling all_inventory to load vars for managed_node1 12613 1727096146.79458: Calling groups_inventory to load vars for managed_node1 12613 1727096146.79460: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.79465: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.79470: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.79473: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.79650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.79774: done with get_vars() 12613 1727096146.79780: done getting variables 12613 1727096146.79822: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12613 1727096146.79918: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Monday 23 September 2024 08:55:46 -0400 (0:00:00.050) 0:00:10.436 ****** 12613 1727096146.79938: entering _queue_task() for managed_node1/command 12613 1727096146.80165: worker is 1 (out of 1 available) 12613 1727096146.80180: exiting _queue_task() for managed_node1/command 12613 1727096146.80191: done queuing things up, now waiting for results queue to drain 12613 1727096146.80193: waiting for pending results... 
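Here the play variable controller_profile renders into the task name ("bond0"). A sketch of the pattern; the nmcli invocation is an assumption, since the test's actual command is not visible in this log:

- hosts: managed_node1
  vars:
    controller_profile: bond0
  tasks:
    - name: From the active connection, get the controller profile "{{ controller_profile }}"
      ansible.builtin.command: nmcli -f NAME,DEVICE connection show --active
      register: active_connections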
12613 1727096146.80365: running TaskExecutor() for managed_node1/TASK: From the active connection, get the controller profile "bond0" 12613 1727096146.80425: in run() - task 0afff68d-5257-a9dd-d073-000000000160 12613 1727096146.80442: variable 'ansible_search_path' from source: unknown 12613 1727096146.80475: calling self._execute() 12613 1727096146.80543: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.80547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.80559: variable 'omit' from source: magic vars 12613 1727096146.80878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.83442: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.83496: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.83521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.83546: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.83568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.83631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.83654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.83671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.83697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.83711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.83805: variable 'ansible_distribution' from source: facts 12613 1727096146.83809: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.83826: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.83830: when evaluation is False, skipping this task 12613 1727096146.83833: _execute() done 12613 1727096146.83835: dumping result to json 12613 1727096146.83837: done dumping result, returning 12613 1727096146.83846: done running TaskExecutor() for managed_node1/TASK: From the active connection, get the controller profile "bond0" [0afff68d-5257-a9dd-d073-000000000160] 12613 1727096146.83850: sending task result for task 0afff68d-5257-a9dd-d073-000000000160 12613 1727096146.83934: done sending task result for task 0afff68d-5257-a9dd-d073-000000000160 12613 1727096146.83936: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.83988: no more pending results, returning what we have 12613 1727096146.83992: results queue empty 12613 1727096146.83993: checking for any_errors_fatal 12613 1727096146.83994: done checking for any_errors_fatal 12613 1727096146.83995: checking for max_fail_percentage 12613 1727096146.83997: done checking for max_fail_percentage 12613 1727096146.83997: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.83998: done checking to see if all hosts have failed 12613 1727096146.83999: getting the remaining hosts for this loop 12613 1727096146.84000: done getting the remaining hosts for this loop 12613 1727096146.84004: getting the next task for host managed_node1 12613 1727096146.84010: done getting next task for host managed_node1 12613 1727096146.84012: ^ task is: TASK: Assert that the controller profile is activated 12613 1727096146.84014: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.84017: getting variables 12613 1727096146.84018: in VariableManager get_vars() 12613 1727096146.84074: Calling all_inventory to load vars for managed_node1 12613 1727096146.84077: Calling groups_inventory to load vars for managed_node1 12613 1727096146.84079: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.84090: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.84092: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.84095: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.84291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.84493: done with get_vars() 12613 1727096146.84616: done getting variables 12613 1727096146.84685: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Monday 23 September 2024 08:55:46 -0400 (0:00:00.047) 0:00:10.484 ****** 12613 1727096146.84718: entering _queue_task() for managed_node1/assert 12613 1727096146.85018: worker is 1 (out of 1 available) 12613 1727096146.85031: exiting _queue_task() for managed_node1/assert 12613 1727096146.85158: done queuing things up, now waiting for results queue to drain 12613 1727096146.85160: waiting for pending results... 
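The follow-up assert can then check the registered output. Hedged sketch, reusing the hypothetical active_connections register from the previous sketch:

- name: Assert that the controller profile is activated
  ansible.builtin.assert:
    that:
      - controller_profile in active_connections.stdout
    fail_msg: "Profile {{ controller_profile }} is not active"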
12613 1727096146.85388: running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated 12613 1727096146.85484: in run() - task 0afff68d-5257-a9dd-d073-000000000161 12613 1727096146.85488: variable 'ansible_search_path' from source: unknown 12613 1727096146.85516: calling self._execute() 12613 1727096146.85618: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.85672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.85676: variable 'omit' from source: magic vars 12613 1727096146.86125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.88576: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.88679: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.88725: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.88850: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.88857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.88977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.88980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.89029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.89194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.89198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.89285: variable 'ansible_distribution' from source: facts 12613 1727096146.89304: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.89329: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.89337: when evaluation is False, skipping this task 12613 1727096146.89344: _execute() done 12613 1727096146.89351: dumping result to json 12613 1727096146.89363: done dumping result, returning 12613 1727096146.89378: done running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated [0afff68d-5257-a9dd-d073-000000000161] 12613 1727096146.89387: sending task result for task 0afff68d-5257-a9dd-d073-000000000161 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.89570: no more pending results, returning what 
we have 12613 1727096146.89574: results queue empty 12613 1727096146.89575: checking for any_errors_fatal 12613 1727096146.89581: done checking for any_errors_fatal 12613 1727096146.89582: checking for max_fail_percentage 12613 1727096146.89584: done checking for max_fail_percentage 12613 1727096146.89584: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.89585: done checking to see if all hosts have failed 12613 1727096146.89586: getting the remaining hosts for this loop 12613 1727096146.89587: done getting the remaining hosts for this loop 12613 1727096146.89597: getting the next task for host managed_node1 12613 1727096146.89603: done getting next task for host managed_node1 12613 1727096146.89605: ^ task is: TASK: Get the controller device details 12613 1727096146.89607: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.89611: getting variables 12613 1727096146.89613: in VariableManager get_vars() 12613 1727096146.89781: Calling all_inventory to load vars for managed_node1 12613 1727096146.89785: Calling groups_inventory to load vars for managed_node1 12613 1727096146.89787: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.89805: done sending task result for task 0afff68d-5257-a9dd-d073-000000000161 12613 1727096146.89808: WORKER PROCESS EXITING 12613 1727096146.89819: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.89885: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.89889: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.90348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.90624: done with get_vars() 12613 1727096146.90638: done getting variables 12613 1727096146.90707: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Monday 23 September 2024 08:55:46 -0400 (0:00:00.060) 0:00:10.544 ****** 12613 1727096146.90736: entering _queue_task() for managed_node1/command 12613 1727096146.91036: worker is 1 (out of 1 available) 12613 1727096146.91192: exiting _queue_task() for managed_node1/command 12613 1727096146.91204: done queuing things up, now waiting for results queue to drain 12613 1727096146.91206: waiting for pending results... 
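The "Get the controller device details" step is another command task. Illustrative only; both the nmcli call and the controller_device variable are assumptions:

- name: Get the controller device details
  ansible.builtin.command: nmcli device show {{ controller_device }}
  register: controller_device_details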
12613 1727096146.91392: running TaskExecutor() for managed_node1/TASK: Get the controller device details 12613 1727096146.91461: in run() - task 0afff68d-5257-a9dd-d073-000000000162 12613 1727096146.91474: variable 'ansible_search_path' from source: unknown 12613 1727096146.91506: calling self._execute() 12613 1727096146.91581: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.91584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.91593: variable 'omit' from source: magic vars 12613 1727096146.91921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.93780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.93874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.93950: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.93979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.94010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.94081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.94108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.94127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.94155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.94171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.94275: variable 'ansible_distribution' from source: facts 12613 1727096146.94279: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.94294: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.94297: when evaluation is False, skipping this task 12613 1727096146.94299: _execute() done 12613 1727096146.94301: dumping result to json 12613 1727096146.94304: done dumping result, returning 12613 1727096146.94311: done running TaskExecutor() for managed_node1/TASK: Get the controller device details [0afff68d-5257-a9dd-d073-000000000162] 12613 1727096146.94316: sending task result for task 0afff68d-5257-a9dd-d073-000000000162 12613 1727096146.94402: done sending task result for task 0afff68d-5257-a9dd-d073-000000000162 12613 1727096146.94405: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 
9)", "skip_reason": "Conditional result was False" } 12613 1727096146.94458: no more pending results, returning what we have 12613 1727096146.94461: results queue empty 12613 1727096146.94462: checking for any_errors_fatal 12613 1727096146.94470: done checking for any_errors_fatal 12613 1727096146.94471: checking for max_fail_percentage 12613 1727096146.94473: done checking for max_fail_percentage 12613 1727096146.94474: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.94475: done checking to see if all hosts have failed 12613 1727096146.94475: getting the remaining hosts for this loop 12613 1727096146.94476: done getting the remaining hosts for this loop 12613 1727096146.94480: getting the next task for host managed_node1 12613 1727096146.94486: done getting next task for host managed_node1 12613 1727096146.94488: ^ task is: TASK: Assert that the controller profile is activated 12613 1727096146.94490: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096146.94493: getting variables 12613 1727096146.94494: in VariableManager get_vars() 12613 1727096146.94549: Calling all_inventory to load vars for managed_node1 12613 1727096146.94554: Calling groups_inventory to load vars for managed_node1 12613 1727096146.94556: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.94566: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.94575: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.94579: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.94756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.94875: done with get_vars() 12613 1727096146.94884: done getting variables 12613 1727096146.94928: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Monday 23 September 2024 08:55:46 -0400 (0:00:00.042) 0:00:10.586 ****** 12613 1727096146.94948: entering _queue_task() for managed_node1/assert 12613 1727096146.95175: worker is 1 (out of 1 available) 12613 1727096146.95189: exiting _queue_task() for managed_node1/assert 12613 1727096146.95201: done queuing things up, now waiting for results queue to drain 12613 1727096146.95202: waiting for pending results... 
12613 1727096146.95370: running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated 12613 1727096146.95434: in run() - task 0afff68d-5257-a9dd-d073-000000000163 12613 1727096146.95448: variable 'ansible_search_path' from source: unknown 12613 1727096146.95478: calling self._execute() 12613 1727096146.95548: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096146.95554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096146.95564: variable 'omit' from source: magic vars 12613 1727096146.95880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096146.97993: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096146.98048: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096146.98079: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096146.98105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096146.98124: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096146.98190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096146.98210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096146.98229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096146.98260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096146.98274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096146.98371: variable 'ansible_distribution' from source: facts 12613 1727096146.98377: variable 'ansible_distribution_major_version' from source: facts 12613 1727096146.98392: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096146.98395: when evaluation is False, skipping this task 12613 1727096146.98397: _execute() done 12613 1727096146.98400: dumping result to json 12613 1727096146.98402: done dumping result, returning 12613 1727096146.98410: done running TaskExecutor() for managed_node1/TASK: Assert that the controller profile is activated [0afff68d-5257-a9dd-d073-000000000163] 12613 1727096146.98414: sending task result for task 0afff68d-5257-a9dd-d073-000000000163 12613 1727096146.98507: done sending task result for task 0afff68d-5257-a9dd-d073-000000000163 12613 1727096146.98509: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096146.98561: no more pending results, returning what we have 12613 1727096146.98564: results queue empty 12613 1727096146.98565: checking for any_errors_fatal 12613 1727096146.98570: done checking for any_errors_fatal 12613 1727096146.98571: checking for max_fail_percentage 12613 1727096146.98573: done checking for max_fail_percentage 12613 1727096146.98573: checking to see if all hosts have failed and the running result is not ok 12613 1727096146.98575: done checking to see if all hosts have failed 12613 1727096146.98575: getting the remaining hosts for this loop 12613 1727096146.98576: done getting the remaining hosts for this loop 12613 1727096146.98580: getting the next task for host managed_node1 12613 1727096146.98589: done getting next task for host managed_node1 12613 1727096146.98594: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096146.98598: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096146.98616: getting variables 12613 1727096146.98617: in VariableManager get_vars() 12613 1727096146.98677: Calling all_inventory to load vars for managed_node1 12613 1727096146.98680: Calling groups_inventory to load vars for managed_node1 12613 1727096146.98682: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096146.98690: Calling all_plugins_play to load vars for managed_node1 12613 1727096146.98692: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096146.98695: Calling groups_plugins_play to load vars for managed_node1 12613 1727096146.98831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096146.98963: done with get_vars() 12613 1727096146.98974: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:55:46 -0400 (0:00:00.040) 0:00:10.627 ****** 12613 1727096146.99045: entering _queue_task() for managed_node1/include_tasks 12613 1727096146.99371: worker is 1 (out of 1 available) 12613 1727096146.99384: exiting _queue_task() for managed_node1/include_tasks 12613 1727096146.99395: done queuing things up, now waiting for results queue to drain 12613 1727096146.99397: waiting for pending results... 
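The task queued above comes from the role itself (roles/network/tasks/main.yml:4) and is dispatched through the include_tasks handler. Its skip carries the same false_condition, which appears to be inherited from the conditional block wrapping the role invocation in the test playbook rather than authored in main.yml. A sketch of what such an entry-point include might look like; the included file name is an assumption and does not appear in this log.

# Hypothetical shape of the role's first step; the included file name is an
# assumption.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml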
12613 1727096146.99792: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12613 1727096146.99856: in run() - task 0afff68d-5257-a9dd-d073-00000000016c 12613 1727096146.99875: variable 'ansible_search_path' from source: unknown 12613 1727096146.99879: variable 'ansible_search_path' from source: unknown 12613 1727096146.99936: calling self._execute() 12613 1727096147.00103: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.00107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.00110: variable 'omit' from source: magic vars 12613 1727096147.00510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.02745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.02797: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.02826: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.02855: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.02879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.02942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.02970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.02988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.03014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.03025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.03128: variable 'ansible_distribution' from source: facts 12613 1727096147.03132: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.03148: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.03151: when evaluation is False, skipping this task 12613 1727096147.03153: _execute() done 12613 1727096147.03160: dumping result to json 12613 1727096147.03164: done dumping result, returning 12613 1727096147.03177: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-a9dd-d073-00000000016c] 12613 1727096147.03180: sending task result for task 0afff68d-5257-a9dd-d073-00000000016c 12613 1727096147.03300: done sending task result for task 0afff68d-5257-a9dd-d073-00000000016c 12613 1727096147.03303: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.03394: no more pending results, returning what we have 12613 1727096147.03398: results queue empty 12613 1727096147.03399: checking for any_errors_fatal 12613 1727096147.03406: done checking for any_errors_fatal 12613 1727096147.03407: checking for max_fail_percentage 12613 1727096147.03408: done checking for max_fail_percentage 12613 1727096147.03409: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.03410: done checking to see if all hosts have failed 12613 1727096147.03411: getting the remaining hosts for this loop 12613 1727096147.03412: done getting the remaining hosts for this loop 12613 1727096147.03416: getting the next task for host managed_node1 12613 1727096147.03422: done getting next task for host managed_node1 12613 1727096147.03426: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096147.03431: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.03448: getting variables 12613 1727096147.03449: in VariableManager get_vars() 12613 1727096147.03497: Calling all_inventory to load vars for managed_node1 12613 1727096147.03500: Calling groups_inventory to load vars for managed_node1 12613 1727096147.03502: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.03510: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.03513: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.03515: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.03745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.03978: done with get_vars() 12613 1727096147.03997: done getting variables 12613 1727096147.04056: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:55:47 -0400 (0:00:00.050) 0:00:10.678 ****** 12613 1727096147.04103: entering _queue_task() for managed_node1/debug 12613 1727096147.04431: worker is 1 (out of 1 available) 12613 1727096147.04445: exiting _queue_task() for managed_node1/debug 12613 1727096147.04459: done queuing things up, now waiting for results queue to drain 12613 1727096147.04460: waiting for pending results... 12613 1727096147.05092: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 12613 1727096147.05170: in run() - task 0afff68d-5257-a9dd-d073-00000000016d 12613 1727096147.05191: variable 'ansible_search_path' from source: unknown 12613 1727096147.05200: variable 'ansible_search_path' from source: unknown 12613 1727096147.05473: calling self._execute() 12613 1727096147.05478: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.05481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.05483: variable 'omit' from source: magic vars 12613 1727096147.05906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.07741: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.07797: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.07825: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.07856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.07876: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.07938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.07962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.07983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.08009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.08022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.08118: variable 'ansible_distribution' from source: facts 12613 1727096147.08121: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.08138: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.08141: when evaluation is False, skipping this task 12613 1727096147.08144: _execute() done 12613 1727096147.08146: dumping result to json 12613 1727096147.08148: done dumping result, returning 12613 1727096147.08156: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-a9dd-d073-00000000016d] 12613 1727096147.08171: sending task result for task 0afff68d-5257-a9dd-d073-00000000016d 12613 1727096147.08254: done sending task result for task 0afff68d-5257-a9dd-d073-00000000016d 12613 1727096147.08257: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096147.08310: no more pending results, returning what we have 12613 1727096147.08313: results queue empty 12613 1727096147.08314: checking for any_errors_fatal 12613 1727096147.08321: done checking for any_errors_fatal 12613 1727096147.08321: checking for max_fail_percentage 12613 1727096147.08324: done checking for max_fail_percentage 12613 1727096147.08325: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.08325: done checking to see if all hosts have failed 12613 1727096147.08326: getting the remaining hosts for this loop 12613 1727096147.08327: done getting the remaining hosts for this loop 12613 1727096147.08331: getting the next task for host managed_node1 12613 1727096147.08338: done getting next task for host managed_node1 12613 1727096147.08342: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096147.08345: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096147.08368: getting variables 12613 1727096147.08370: in VariableManager get_vars() 12613 1727096147.08419: Calling all_inventory to load vars for managed_node1 12613 1727096147.08422: Calling groups_inventory to load vars for managed_node1 12613 1727096147.08424: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.08432: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.08435: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.08437: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.08612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.08813: done with get_vars() 12613 1727096147.08824: done getting variables 12613 1727096147.08887: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:55:47 -0400 (0:00:00.048) 0:00:10.726 ****** 12613 1727096147.08922: entering _queue_task() for managed_node1/fail 12613 1727096147.09230: worker is 1 (out of 1 available) 12613 1727096147.09243: exiting _queue_task() for managed_node1/fail 12613 1727096147.09256: done queuing things up, now waiting for results queue to drain 12613 1727096147.09257: waiting for pending results... 
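The guard queued above (roles/network/tasks/main.yml:11) is loaded through the fail action plugin, i.e. a task whose only job is to abort the play when an unsupported combination is requested. A sketch of a fail-based guard matching the task name follows; the message text and the exact condition are assumptions (presumably involving the role variables network_state and network_provider), since neither appears in this log.

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    # Message text is an assumption.
    msg: The network_state variable cannot be used with the initscripts provider.
  when:
    # Condition is an assumption inferred from the task name.
    - network_state is defined
    - network_provider == "initscripts"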
12613 1727096147.09674: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12613 1727096147.09704: in run() - task 0afff68d-5257-a9dd-d073-00000000016e 12613 1727096147.09708: variable 'ansible_search_path' from source: unknown 12613 1727096147.09711: variable 'ansible_search_path' from source: unknown 12613 1727096147.09739: calling self._execute() 12613 1727096147.09816: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.09819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.09829: variable 'omit' from source: magic vars 12613 1727096147.10142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.11714: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.11974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.11977: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.11979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.11981: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.11984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.12040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.12076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.12132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.12155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.12319: variable 'ansible_distribution' from source: facts 12613 1727096147.12331: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.12370: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.12379: when evaluation is False, skipping this task 12613 1727096147.12387: _execute() done 12613 1727096147.12393: dumping result to json 12613 1727096147.12399: done dumping result, returning 12613 1727096147.12410: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-a9dd-d073-00000000016e] 12613 1727096147.12419: sending task result for task 
0afff68d-5257-a9dd-d073-00000000016e skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.12606: no more pending results, returning what we have 12613 1727096147.12610: results queue empty 12613 1727096147.12611: checking for any_errors_fatal 12613 1727096147.12616: done checking for any_errors_fatal 12613 1727096147.12617: checking for max_fail_percentage 12613 1727096147.12619: done checking for max_fail_percentage 12613 1727096147.12619: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.12620: done checking to see if all hosts have failed 12613 1727096147.12621: getting the remaining hosts for this loop 12613 1727096147.12623: done getting the remaining hosts for this loop 12613 1727096147.12627: getting the next task for host managed_node1 12613 1727096147.12635: done getting next task for host managed_node1 12613 1727096147.12639: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096147.12642: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.12668: getting variables 12613 1727096147.12670: in VariableManager get_vars() 12613 1727096147.12725: Calling all_inventory to load vars for managed_node1 12613 1727096147.12728: Calling groups_inventory to load vars for managed_node1 12613 1727096147.12730: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.12741: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.12744: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.12909: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.13163: done sending task result for task 0afff68d-5257-a9dd-d073-00000000016e 12613 1727096147.13166: WORKER PROCESS EXITING 12613 1727096147.13179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.13304: done with get_vars() 12613 1727096147.13312: done getting variables 12613 1727096147.13352: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:55:47 -0400 (0:00:00.044) 0:00:10.770 ****** 12613 1727096147.13379: entering _queue_task() for managed_node1/fail 12613 1727096147.13594: worker is 1 (out of 1 available) 12613 1727096147.13608: exiting _queue_task() for managed_node1/fail 12613 1727096147.13620: done queuing things up, now waiting for results queue to drain 12613 1727096147.13621: waiting for pending results... 
12613 1727096147.13798: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12613 1727096147.13892: in run() - task 0afff68d-5257-a9dd-d073-00000000016f 12613 1727096147.13902: variable 'ansible_search_path' from source: unknown 12613 1727096147.13906: variable 'ansible_search_path' from source: unknown 12613 1727096147.13934: calling self._execute() 12613 1727096147.14003: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.14007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.14017: variable 'omit' from source: magic vars 12613 1727096147.14333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.16475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.16489: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.16532: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.16574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.16606: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.16703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.16724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.16749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.16776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.16787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.16895: variable 'ansible_distribution' from source: facts 12613 1727096147.16899: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.16915: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.16918: when evaluation is False, skipping this task 12613 1727096147.16920: _execute() done 12613 1727096147.16922: dumping result to json 12613 1727096147.16926: done dumping result, returning 12613 1727096147.16933: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-a9dd-d073-00000000016f] 12613 1727096147.16938: sending task result for task 0afff68d-5257-a9dd-d073-00000000016f skipping: [managed_node1] 
=> { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.17087: no more pending results, returning what we have 12613 1727096147.17091: results queue empty 12613 1727096147.17092: checking for any_errors_fatal 12613 1727096147.17100: done checking for any_errors_fatal 12613 1727096147.17100: checking for max_fail_percentage 12613 1727096147.17102: done checking for max_fail_percentage 12613 1727096147.17103: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.17103: done checking to see if all hosts have failed 12613 1727096147.17104: getting the remaining hosts for this loop 12613 1727096147.17106: done getting the remaining hosts for this loop 12613 1727096147.17109: getting the next task for host managed_node1 12613 1727096147.17116: done getting next task for host managed_node1 12613 1727096147.17120: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096147.17123: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.17133: done sending task result for task 0afff68d-5257-a9dd-d073-00000000016f 12613 1727096147.17136: WORKER PROCESS EXITING 12613 1727096147.17151: getting variables 12613 1727096147.17155: in VariableManager get_vars() 12613 1727096147.17216: Calling all_inventory to load vars for managed_node1 12613 1727096147.17219: Calling groups_inventory to load vars for managed_node1 12613 1727096147.17221: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.17229: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.17231: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.17234: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.17369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.17501: done with get_vars() 12613 1727096147.17511: done getting variables 12613 1727096147.17554: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:55:47 -0400 (0:00:00.041) 0:00:10.812 ****** 12613 1727096147.17580: entering _queue_task() for managed_node1/fail 12613 1727096147.17806: worker is 1 (out of 1 available) 12613 1727096147.17819: exiting _queue_task() for managed_node1/fail 12613 1727096147.17832: done queuing things up, now waiting for results queue to drain 12613 1727096147.17834: waiting for pending results... 
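The teaming guard queued above (main.yml:25) is the same fail-plugin pattern applied as a version threshold: it should trip on EL10 or later when teaming is requested. A sketch under those assumptions; the exact condition, including how the role detects requested team interfaces, is not shown in this log.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    # Message text is an assumption.
    msg: Team interfaces are not supported on EL10 or later.
  when:
    # Version threshold inferred from the task name; the real task presumably
    # also checks whether any team interfaces are actually requested.
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int >= 10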
12613 1727096147.18013: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12613 1727096147.18105: in run() - task 0afff68d-5257-a9dd-d073-000000000170 12613 1727096147.18116: variable 'ansible_search_path' from source: unknown 12613 1727096147.18119: variable 'ansible_search_path' from source: unknown 12613 1727096147.18149: calling self._execute() 12613 1727096147.18220: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.18223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.18234: variable 'omit' from source: magic vars 12613 1727096147.18771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.20538: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.20588: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.20826: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.20858: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.20878: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.20940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.20963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.20984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.21011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.21022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.21122: variable 'ansible_distribution' from source: facts 12613 1727096147.21125: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.21142: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.21145: when evaluation is False, skipping this task 12613 1727096147.21147: _execute() done 12613 1727096147.21149: dumping result to json 12613 1727096147.21154: done dumping result, returning 12613 1727096147.21162: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-a9dd-d073-000000000170] 12613 1727096147.21164: sending task result for task 0afff68d-5257-a9dd-d073-000000000170 12613 1727096147.21258: done 
sending task result for task 0afff68d-5257-a9dd-d073-000000000170 12613 1727096147.21261: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.21320: no more pending results, returning what we have 12613 1727096147.21323: results queue empty 12613 1727096147.21324: checking for any_errors_fatal 12613 1727096147.21329: done checking for any_errors_fatal 12613 1727096147.21330: checking for max_fail_percentage 12613 1727096147.21331: done checking for max_fail_percentage 12613 1727096147.21332: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.21333: done checking to see if all hosts have failed 12613 1727096147.21333: getting the remaining hosts for this loop 12613 1727096147.21335: done getting the remaining hosts for this loop 12613 1727096147.21338: getting the next task for host managed_node1 12613 1727096147.21346: done getting next task for host managed_node1 12613 1727096147.21349: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096147.21356: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.21377: getting variables 12613 1727096147.21379: in VariableManager get_vars() 12613 1727096147.21427: Calling all_inventory to load vars for managed_node1 12613 1727096147.21430: Calling groups_inventory to load vars for managed_node1 12613 1727096147.21432: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.21441: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.21443: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.21445: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.21624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.21754: done with get_vars() 12613 1727096147.21763: done getting variables 12613 1727096147.21809: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:55:47 -0400 (0:00:00.042) 0:00:10.855 ****** 12613 1727096147.21834: entering _queue_task() for managed_node1/dnf 12613 1727096147.22065: worker is 1 (out of 1 available) 12613 1727096147.22082: exiting _queue_task() for managed_node1/dnf 12613 1727096147.22094: done queuing things up, now waiting for results queue to drain 12613 1727096147.22095: waiting for pending results... 
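For the package check queued above (main.yml:36) the strategy loads the dnf action plugin. The task name indicates a query for pending NetworkManager-related updates triggered by wireless or team interfaces; a sketch of a check-only dnf task of that shape follows, with the package list, check_mode, and register name all being assumptions.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    # Package list is an assumption.
    name:
      - NetworkManager
      - NetworkManager-wifi
      - NetworkManager-team
    state: latest
  check_mode: true                   # query only, never install
  register: __network_dnf_updates   # hypothetical register name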
12613 1727096147.22271: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12613 1727096147.22362: in run() - task 0afff68d-5257-a9dd-d073-000000000171 12613 1727096147.22375: variable 'ansible_search_path' from source: unknown 12613 1727096147.22379: variable 'ansible_search_path' from source: unknown 12613 1727096147.22407: calling self._execute() 12613 1727096147.22477: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.22481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.22490: variable 'omit' from source: magic vars 12613 1727096147.22803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.24572: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.24619: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.24644: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.24674: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.24692: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.24756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.24778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.24796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.24822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.24834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.24937: variable 'ansible_distribution' from source: facts 12613 1727096147.24949: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.24963: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.24966: when evaluation is False, skipping this task 12613 1727096147.24984: _execute() done 12613 1727096147.24986: dumping result to json 12613 1727096147.24989: done dumping result, returning 12613 1727096147.24991: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000171] 12613 1727096147.24994: sending task result for task 
0afff68d-5257-a9dd-d073-000000000171 12613 1727096147.25080: done sending task result for task 0afff68d-5257-a9dd-d073-000000000171 12613 1727096147.25083: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.25131: no more pending results, returning what we have 12613 1727096147.25134: results queue empty 12613 1727096147.25135: checking for any_errors_fatal 12613 1727096147.25141: done checking for any_errors_fatal 12613 1727096147.25142: checking for max_fail_percentage 12613 1727096147.25144: done checking for max_fail_percentage 12613 1727096147.25145: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.25145: done checking to see if all hosts have failed 12613 1727096147.25146: getting the remaining hosts for this loop 12613 1727096147.25147: done getting the remaining hosts for this loop 12613 1727096147.25151: getting the next task for host managed_node1 12613 1727096147.25159: done getting next task for host managed_node1 12613 1727096147.25162: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096147.25166: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.25186: getting variables 12613 1727096147.25188: in VariableManager get_vars() 12613 1727096147.25238: Calling all_inventory to load vars for managed_node1 12613 1727096147.25241: Calling groups_inventory to load vars for managed_node1 12613 1727096147.25243: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.25253: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.25255: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.25258: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.25437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.25657: done with get_vars() 12613 1727096147.25672: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12613 1727096147.25757: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:55:47 -0400 (0:00:00.039) 0:00:10.895 ****** 12613 1727096147.25795: entering _queue_task() for managed_node1/yum 12613 1727096147.26135: worker is 1 (out of 1 available) 12613 1727096147.26155: exiting _queue_task() for managed_node1/yum 12613 1727096147.26171: done queuing things up, now waiting for results queue to drain 12613 1727096147.26173: waiting for pending results... 
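Note the redirect logged just before this task: ansible.builtin.yum resolves to the ansible.builtin.dnf action plugin, so on this control node the YUM-flavoured check is serviced by the same code path as the DNF one above. A minimal sketch of a task written against the yum module that ends up handled that way; the parameters are assumptions.

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:       # redirected to the dnf action plugin at runtime
    name: NetworkManager     # package name is an assumption
    state: latest
  check_mode: true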
12613 1727096147.26874: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12613 1727096147.27174: in run() - task 0afff68d-5257-a9dd-d073-000000000172 12613 1727096147.27178: variable 'ansible_search_path' from source: unknown 12613 1727096147.27181: variable 'ansible_search_path' from source: unknown 12613 1727096147.27183: calling self._execute() 12613 1727096147.27185: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.27187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.27776: variable 'omit' from source: magic vars 12613 1727096147.28185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.31717: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.31793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.31840: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.31899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.31932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.32020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.32058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.32096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.32143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.32163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.32311: variable 'ansible_distribution' from source: facts 12613 1727096147.32321: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.32342: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.32348: when evaluation is False, skipping this task 12613 1727096147.32358: _execute() done 12613 1727096147.32365: dumping result to json 12613 1727096147.32374: done dumping result, returning 12613 1727096147.32384: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000172] 12613 1727096147.32393: sending task result for task 
0afff68d-5257-a9dd-d073-000000000172 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.32622: no more pending results, returning what we have 12613 1727096147.32627: results queue empty 12613 1727096147.32629: checking for any_errors_fatal 12613 1727096147.32635: done checking for any_errors_fatal 12613 1727096147.32636: checking for max_fail_percentage 12613 1727096147.32638: done checking for max_fail_percentage 12613 1727096147.32639: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.32639: done checking to see if all hosts have failed 12613 1727096147.32640: getting the remaining hosts for this loop 12613 1727096147.32642: done getting the remaining hosts for this loop 12613 1727096147.32646: getting the next task for host managed_node1 12613 1727096147.32653: done getting next task for host managed_node1 12613 1727096147.32657: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096147.32661: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.32744: getting variables 12613 1727096147.32746: in VariableManager get_vars() 12613 1727096147.32978: Calling all_inventory to load vars for managed_node1 12613 1727096147.32981: Calling groups_inventory to load vars for managed_node1 12613 1727096147.32983: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.33060: done sending task result for task 0afff68d-5257-a9dd-d073-000000000172 12613 1727096147.33063: WORKER PROCESS EXITING 12613 1727096147.33073: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.33076: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.33079: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.33263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.33481: done with get_vars() 12613 1727096147.33496: done getting variables 12613 1727096147.33551: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:55:47 -0400 (0:00:00.077) 0:00:10.973 ****** 12613 1727096147.33586: entering _queue_task() for managed_node1/fail 12613 1727096147.33882: worker is 1 (out of 1 available) 12613 1727096147.33895: exiting _queue_task() for managed_node1/fail 12613 1727096147.33907: done queuing things up, now waiting for results queue to drain 12613 1727096147.33909: waiting for pending results... 
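Note for readers following the role logic: the "false_condition" printed in the skip results above is the literal when: guard that was evaluated against the gathered facts. Below is a minimal sketch of a task carrying that guard; the command, register name, and failure handling are hypothetical placeholders, since the actual task body in roles/network/tasks/main.yml is not reproduced in this log.

# Sketch (assumption): a task guarded the same way as the skipped task above.
# Only the when: condition is taken from the log; everything else is illustrative.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.command: yum -q check-update NetworkManager
  register: __yum_check
  changed_when: false
  failed_when: __yum_check.rc not in [0, 100]  # yum check-update exits 100 when updates exist
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9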
12613 1727096147.34173: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12613 1727096147.34332: in run() - task 0afff68d-5257-a9dd-d073-000000000173 12613 1727096147.34356: variable 'ansible_search_path' from source: unknown 12613 1727096147.34375: variable 'ansible_search_path' from source: unknown 12613 1727096147.34418: calling self._execute() 12613 1727096147.34521: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.34533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.34584: variable 'omit' from source: magic vars 12613 1727096147.35024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.37732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.37796: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.37836: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.37947: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.37951: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.37971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.38052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.38056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.38063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.38095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.38225: variable 'ansible_distribution' from source: facts 12613 1727096147.38232: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.38251: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.38254: when evaluation is False, skipping this task 12613 1727096147.38259: _execute() done 12613 1727096147.38262: dumping result to json 12613 1727096147.38271: done dumping result, returning 12613 1727096147.38302: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000173] 12613 1727096147.38309: sending task result for task 0afff68d-5257-a9dd-d073-000000000173 skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.38507: no more pending results, returning what we have 12613 1727096147.38511: results queue empty 12613 1727096147.38512: checking for any_errors_fatal 12613 1727096147.38525: done checking for any_errors_fatal 12613 1727096147.38526: checking for max_fail_percentage 12613 1727096147.38528: done checking for max_fail_percentage 12613 1727096147.38528: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.38529: done checking to see if all hosts have failed 12613 1727096147.38530: getting the remaining hosts for this loop 12613 1727096147.38532: done getting the remaining hosts for this loop 12613 1727096147.38536: getting the next task for host managed_node1 12613 1727096147.38543: done getting next task for host managed_node1 12613 1727096147.38547: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12613 1727096147.38551: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.38571: getting variables 12613 1727096147.38573: in VariableManager get_vars() 12613 1727096147.38618: Calling all_inventory to load vars for managed_node1 12613 1727096147.38620: Calling groups_inventory to load vars for managed_node1 12613 1727096147.38622: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.38631: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.38634: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.38636: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.38783: done sending task result for task 0afff68d-5257-a9dd-d073-000000000173 12613 1727096147.38819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.39057: done with get_vars() 12613 1727096147.39070: done getting variables 12613 1727096147.39102: WORKER PROCESS EXITING 12613 1727096147.39144: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:55:47 -0400 (0:00:00.055) 0:00:11.028 ****** 12613 1727096147.39180: entering _queue_task() for managed_node1/package 12613 1727096147.39599: worker is 1 (out of 1 available) 12613 1727096147.39610: exiting _queue_task() for managed_node1/package 12613 1727096147.39621: done queuing things up, now waiting for results queue to drain 12613 1727096147.39622: waiting for pending results... 
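The consent task above is handled by the fail action plugin the log just loaded, so when its guard does hold it aborts the play with a message. A hedged sketch of that pattern follows; the message wording and the network_allow_restart variable are hypothetical, not the role's real interface.

# Sketch (assumption): a consent gate built on ansible.builtin.fail.
# network_allow_restart is a made-up variable name used only for illustration.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Restarting NetworkManager is required for wireless or team interfaces; set a confirmation variable to allow it.
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9
    - not (network_allow_restart | default(false))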
12613 1727096147.39790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 12613 1727096147.39957: in run() - task 0afff68d-5257-a9dd-d073-000000000174 12613 1727096147.39972: variable 'ansible_search_path' from source: unknown 12613 1727096147.39975: variable 'ansible_search_path' from source: unknown 12613 1727096147.40005: calling self._execute() 12613 1727096147.40080: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.40084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.40094: variable 'omit' from source: magic vars 12613 1727096147.40673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.43434: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.43518: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.43581: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.43625: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.43659: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.43755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.43780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.43798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.43832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.43843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.43955: variable 'ansible_distribution' from source: facts 12613 1727096147.43963: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.43979: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.43982: when evaluation is False, skipping this task 12613 1727096147.43985: _execute() done 12613 1727096147.43987: dumping result to json 12613 1727096147.43989: done dumping result, returning 12613 1727096147.43997: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-a9dd-d073-000000000174] 12613 1727096147.44001: sending task result for task 0afff68d-5257-a9dd-d073-000000000174 12613 1727096147.44097: done sending task result for task 0afff68d-5257-a9dd-d073-000000000174 12613 1727096147.44099: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.44175: no more pending results, returning what we have 12613 1727096147.44178: results queue empty 12613 1727096147.44179: checking for any_errors_fatal 12613 1727096147.44186: done checking for any_errors_fatal 12613 1727096147.44186: checking for max_fail_percentage 12613 1727096147.44188: done checking for max_fail_percentage 12613 1727096147.44188: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.44189: done checking to see if all hosts have failed 12613 1727096147.44190: getting the remaining hosts for this loop 12613 1727096147.44191: done getting the remaining hosts for this loop 12613 1727096147.44194: getting the next task for host managed_node1 12613 1727096147.44201: done getting next task for host managed_node1 12613 1727096147.44205: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096147.44209: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.44228: getting variables 12613 1727096147.44230: in VariableManager get_vars() 12613 1727096147.44276: Calling all_inventory to load vars for managed_node1 12613 1727096147.44279: Calling groups_inventory to load vars for managed_node1 12613 1727096147.44281: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.44289: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.44291: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.44294: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.44421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.44544: done with get_vars() 12613 1727096147.44553: done getting variables 12613 1727096147.44597: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:55:47 -0400 (0:00:00.054) 0:00:11.083 ****** 12613 1727096147.44621: entering _queue_task() for managed_node1/package 12613 1727096147.44830: worker is 1 (out of 1 available) 12613 1727096147.44844: exiting _queue_task() for managed_node1/package 12613 1727096147.44854: done queuing things up, now waiting for results queue to drain 12613 1727096147.44855: waiting for pending results... 
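The next install steps are dispatched through the package action plugin shown above. A minimal sketch of such a task follows; the package names simply mirror the task title, and the gate on network_state is an assumption about how the role decides to run this step.

# Sketch (assumption): ansible.builtin.package as dispatched by the package action plugin.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state | default({}) != {}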
12613 1727096147.45185: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12613 1727096147.45230: in run() - task 0afff68d-5257-a9dd-d073-000000000175 12613 1727096147.45251: variable 'ansible_search_path' from source: unknown 12613 1727096147.45259: variable 'ansible_search_path' from source: unknown 12613 1727096147.45307: calling self._execute() 12613 1727096147.45415: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.45507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.45583: variable 'omit' from source: magic vars 12613 1727096147.46392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.48775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.48780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.48809: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.48851: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.48886: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.48979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.49015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.49046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.49090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.49109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.49242: variable 'ansible_distribution' from source: facts 12613 1727096147.49255: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.49282: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.49374: when evaluation is False, skipping this task 12613 1727096147.49378: _execute() done 12613 1727096147.49382: dumping result to json 12613 1727096147.49385: done dumping result, returning 12613 1727096147.49388: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-a9dd-d073-000000000175] 12613 1727096147.49390: sending task result for task 0afff68d-5257-a9dd-d073-000000000175 12613 1727096147.49473: done sending task result for task 
0afff68d-5257-a9dd-d073-000000000175 12613 1727096147.49477: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.49528: no more pending results, returning what we have 12613 1727096147.49532: results queue empty 12613 1727096147.49533: checking for any_errors_fatal 12613 1727096147.49537: done checking for any_errors_fatal 12613 1727096147.49538: checking for max_fail_percentage 12613 1727096147.49540: done checking for max_fail_percentage 12613 1727096147.49541: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.49541: done checking to see if all hosts have failed 12613 1727096147.49542: getting the remaining hosts for this loop 12613 1727096147.49543: done getting the remaining hosts for this loop 12613 1727096147.49546: getting the next task for host managed_node1 12613 1727096147.49556: done getting next task for host managed_node1 12613 1727096147.49560: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096147.49564: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.49585: getting variables 12613 1727096147.49587: in VariableManager get_vars() 12613 1727096147.49636: Calling all_inventory to load vars for managed_node1 12613 1727096147.49638: Calling groups_inventory to load vars for managed_node1 12613 1727096147.49640: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.49650: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.49654: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.49657: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.50329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.50550: done with get_vars() 12613 1727096147.50563: done getting variables 12613 1727096147.50629: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:55:47 -0400 (0:00:00.060) 0:00:11.143 ****** 12613 1727096147.50665: entering _queue_task() for managed_node1/package 12613 1727096147.50962: worker is 1 (out of 1 available) 12613 1727096147.50976: exiting _queue_task() for managed_node1/package 12613 1727096147.50986: done queuing things up, now waiting for results queue to drain 12613 1727096147.50988: waiting for pending results... 
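For context on the network_state variable referenced in these task names, the following is a speculative sketch of how a play could supply it in nmstate form; the interface name and the exact schema accepted by the role are assumptions, not taken from this run.

# Sketch (assumption): passing an nmstate-style network_state to the role.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1
              type: ethernet
              state: up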
12613 1727096147.51171: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12613 1727096147.51268: in run() - task 0afff68d-5257-a9dd-d073-000000000176 12613 1727096147.51280: variable 'ansible_search_path' from source: unknown 12613 1727096147.51283: variable 'ansible_search_path' from source: unknown 12613 1727096147.51312: calling self._execute() 12613 1727096147.51386: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.51393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.51402: variable 'omit' from source: magic vars 12613 1727096147.51830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.53946: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.54005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.54033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.54065: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.54087: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.54148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.54176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.54193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.54219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.54231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.54335: variable 'ansible_distribution' from source: facts 12613 1727096147.54338: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.54354: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.54359: when evaluation is False, skipping this task 12613 1727096147.54362: _execute() done 12613 1727096147.54364: dumping result to json 12613 1727096147.54370: done dumping result, returning 12613 1727096147.54377: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-a9dd-d073-000000000176] 12613 1727096147.54381: sending task result for task 0afff68d-5257-a9dd-d073-000000000176 12613 1727096147.54474: done sending task result for task 0afff68d-5257-a9dd-d073-000000000176 12613 
1727096147.54477: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.54536: no more pending results, returning what we have 12613 1727096147.54539: results queue empty 12613 1727096147.54540: checking for any_errors_fatal 12613 1727096147.54548: done checking for any_errors_fatal 12613 1727096147.54548: checking for max_fail_percentage 12613 1727096147.54550: done checking for max_fail_percentage 12613 1727096147.54551: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.54551: done checking to see if all hosts have failed 12613 1727096147.54552: getting the remaining hosts for this loop 12613 1727096147.54553: done getting the remaining hosts for this loop 12613 1727096147.54557: getting the next task for host managed_node1 12613 1727096147.54563: done getting next task for host managed_node1 12613 1727096147.54567: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096147.54572: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.54593: getting variables 12613 1727096147.54595: in VariableManager get_vars() 12613 1727096147.54644: Calling all_inventory to load vars for managed_node1 12613 1727096147.54646: Calling groups_inventory to load vars for managed_node1 12613 1727096147.54648: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.54656: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.54659: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.54661: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.54800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.54931: done with get_vars() 12613 1727096147.54941: done getting variables 12613 1727096147.54984: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:55:47 -0400 (0:00:00.043) 0:00:11.187 ****** 12613 1727096147.55011: entering _queue_task() for managed_node1/service 12613 1727096147.55238: worker is 1 (out of 1 available) 12613 1727096147.55251: exiting _queue_task() for managed_node1/service 12613 1727096147.55262: done queuing things up, now waiting for results queue to drain 12613 1727096147.55263: waiting for pending results... 
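The restart task is handled by the service action plugin loaded above. A sketch of a service restart guarded the same way as the skips in this run; only the module family and the guard are grounded in the log, the rest is illustrative.

# Sketch (assumption): ansible.builtin.service restarting NetworkManager under the same guard.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9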
12613 1727096147.55576: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12613 1727096147.55623: in run() - task 0afff68d-5257-a9dd-d073-000000000177 12613 1727096147.55659: variable 'ansible_search_path' from source: unknown 12613 1727096147.55663: variable 'ansible_search_path' from source: unknown 12613 1727096147.55763: calling self._execute() 12613 1727096147.55805: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.55816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.55832: variable 'omit' from source: magic vars 12613 1727096147.56293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.57963: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.58013: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.58040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.58071: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.58091: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.58150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.58175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.58195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.58220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.58231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.58330: variable 'ansible_distribution' from source: facts 12613 1727096147.58333: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.58349: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.58354: when evaluation is False, skipping this task 12613 1727096147.58357: _execute() done 12613 1727096147.58359: dumping result to json 12613 1727096147.58361: done dumping result, returning 12613 1727096147.58364: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-a9dd-d073-000000000177] 12613 1727096147.58371: sending task result for task 0afff68d-5257-a9dd-d073-000000000177 12613 1727096147.58462: done sending task result for task 0afff68d-5257-a9dd-d073-000000000177 12613 
1727096147.58465: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.58527: no more pending results, returning what we have 12613 1727096147.58531: results queue empty 12613 1727096147.58531: checking for any_errors_fatal 12613 1727096147.58536: done checking for any_errors_fatal 12613 1727096147.58537: checking for max_fail_percentage 12613 1727096147.58539: done checking for max_fail_percentage 12613 1727096147.58540: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.58540: done checking to see if all hosts have failed 12613 1727096147.58541: getting the remaining hosts for this loop 12613 1727096147.58542: done getting the remaining hosts for this loop 12613 1727096147.58546: getting the next task for host managed_node1 12613 1727096147.58555: done getting next task for host managed_node1 12613 1727096147.58559: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096147.58563: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.58584: getting variables 12613 1727096147.58586: in VariableManager get_vars() 12613 1727096147.58634: Calling all_inventory to load vars for managed_node1 12613 1727096147.58636: Calling groups_inventory to load vars for managed_node1 12613 1727096147.58638: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.58647: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.58649: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.58651: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.58834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.58961: done with get_vars() 12613 1727096147.58972: done getting variables 12613 1727096147.59016: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:55:47 -0400 (0:00:00.040) 0:00:11.227 ****** 12613 1727096147.59040: entering _queue_task() for managed_node1/service 12613 1727096147.59264: worker is 1 (out of 1 available) 12613 1727096147.59280: exiting _queue_task() for managed_node1/service 12613 1727096147.59291: done queuing things up, now waiting for results queue to drain 12613 1727096147.59292: waiting for pending results... 12613 1727096147.59465: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12613 1727096147.59560: in run() - task 0afff68d-5257-a9dd-d073-000000000178 12613 1727096147.59573: variable 'ansible_search_path' from source: unknown 12613 1727096147.59576: variable 'ansible_search_path' from source: unknown 12613 1727096147.59605: calling self._execute() 12613 1727096147.59674: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.59678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.59688: variable 'omit' from source: magic vars 12613 1727096147.59997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.61527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.61585: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.61611: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.61637: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.61658: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.61723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.61747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.61764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.61791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.61803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.61900: variable 'ansible_distribution' from source: facts 12613 1727096147.61905: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.61923: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.61926: when evaluation is False, skipping this task 12613 1727096147.61929: _execute() done 12613 1727096147.61931: dumping result to json 12613 1727096147.61933: done dumping result, returning 12613 1727096147.61939: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-a9dd-d073-000000000178] 12613 1727096147.61944: sending task result for task 0afff68d-5257-a9dd-d073-000000000178 12613 1727096147.62035: done sending task result for task 0afff68d-5257-a9dd-d073-000000000178 12613 1727096147.62037: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 1727096147.62083: no more pending results, returning what we have 12613 1727096147.62087: results queue empty 12613 1727096147.62088: checking for any_errors_fatal 12613 1727096147.62092: done checking for any_errors_fatal 12613 1727096147.62092: checking for max_fail_percentage 12613 1727096147.62094: done checking for max_fail_percentage 12613 1727096147.62095: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.62096: done checking to see if all hosts have failed 12613 1727096147.62096: getting the remaining hosts for this loop 12613 1727096147.62097: done getting the remaining hosts for this loop 12613 1727096147.62101: getting the next task for host managed_node1 12613 1727096147.62108: done getting next task for host managed_node1 12613 1727096147.62111: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096147.62115: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096147.62133: getting variables 12613 1727096147.62135: in VariableManager get_vars() 12613 1727096147.62187: Calling all_inventory to load vars for managed_node1 12613 1727096147.62189: Calling groups_inventory to load vars for managed_node1 12613 1727096147.62191: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.62200: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.62202: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.62205: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.62345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.62483: done with get_vars() 12613 1727096147.62494: done getting variables 12613 1727096147.62535: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:55:47 -0400 (0:00:00.035) 0:00:11.262 ****** 12613 1727096147.62562: entering _queue_task() for managed_node1/service 12613 1727096147.62777: worker is 1 (out of 1 available) 12613 1727096147.62790: exiting _queue_task() for managed_node1/service 12613 1727096147.62802: done queuing things up, now waiting for results queue to drain 12613 1727096147.62803: waiting for pending results... 
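The "Enable and start NetworkManager" result above is printed as censored because the task sets no_log: true, which Ansible reports verbatim even for a skipped result. A minimal sketch of that combination; the service arguments are assumptions, the no_log behaviour is what the log itself states.

# Sketch (assumption): a service task whose output is suppressed via no_log.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # produces the "output has been hidden" placeholder seen above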
12613 1727096147.62975: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12613 1727096147.63065: in run() - task 0afff68d-5257-a9dd-d073-000000000179 12613 1727096147.63078: variable 'ansible_search_path' from source: unknown 12613 1727096147.63082: variable 'ansible_search_path' from source: unknown 12613 1727096147.63110: calling self._execute() 12613 1727096147.63178: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.63182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.63192: variable 'omit' from source: magic vars 12613 1727096147.63504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.65059: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.65107: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.65136: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.65162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.65183: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.65245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.65268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.65287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.65315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.65327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.65425: variable 'ansible_distribution' from source: facts 12613 1727096147.65431: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.65446: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.65449: when evaluation is False, skipping this task 12613 1727096147.65454: _execute() done 12613 1727096147.65482: dumping result to json 12613 1727096147.65485: done dumping result, returning 12613 1727096147.65487: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-a9dd-d073-000000000179] 12613 1727096147.65489: sending task result for task 0afff68d-5257-a9dd-d073-000000000179 12613 1727096147.65571: done sending task result for task 0afff68d-5257-a9dd-d073-000000000179 12613 1727096147.65574: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.65616: no more pending results, returning what we have 12613 1727096147.65619: results queue empty 12613 1727096147.65620: checking for any_errors_fatal 12613 1727096147.65625: done checking for any_errors_fatal 12613 1727096147.65626: checking for max_fail_percentage 12613 1727096147.65628: done checking for max_fail_percentage 12613 1727096147.65629: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.65629: done checking to see if all hosts have failed 12613 1727096147.65630: getting the remaining hosts for this loop 12613 1727096147.65631: done getting the remaining hosts for this loop 12613 1727096147.65635: getting the next task for host managed_node1 12613 1727096147.65641: done getting next task for host managed_node1 12613 1727096147.65645: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096147.65649: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.65669: getting variables 12613 1727096147.65671: in VariableManager get_vars() 12613 1727096147.65721: Calling all_inventory to load vars for managed_node1 12613 1727096147.65724: Calling groups_inventory to load vars for managed_node1 12613 1727096147.65726: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.65735: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.65738: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.65740: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.65934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.66068: done with get_vars() 12613 1727096147.66077: done getting variables 12613 1727096147.66117: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:55:47 -0400 (0:00:00.035) 0:00:11.298 ****** 12613 1727096147.66142: entering _queue_task() for managed_node1/service 12613 1727096147.66357: worker is 1 (out of 1 available) 12613 1727096147.66372: exiting _queue_task() for managed_node1/service 12613 1727096147.66383: done queuing things up, now waiting for results queue to drain 12613 1727096147.66385: waiting for pending results... 12613 1727096147.66562: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 12613 1727096147.66652: in run() - task 0afff68d-5257-a9dd-d073-00000000017a 12613 1727096147.66665: variable 'ansible_search_path' from source: unknown 12613 1727096147.66670: variable 'ansible_search_path' from source: unknown 12613 1727096147.66699: calling self._execute() 12613 1727096147.66772: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.66776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.66787: variable 'omit' from source: magic vars 12613 1727096147.67103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.68655: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.68713: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.68740: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.68770: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.68793: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.68851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.68876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.68901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.68924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.68935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.69039: variable 'ansible_distribution' from source: facts 12613 1727096147.69042: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.69061: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.69065: when evaluation is False, skipping this task 12613 1727096147.69069: _execute() done 12613 1727096147.69072: dumping result to json 12613 1727096147.69074: done dumping result, returning 12613 1727096147.69080: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-a9dd-d073-00000000017a] 12613 1727096147.69085: sending task result for task 0afff68d-5257-a9dd-d073-00000000017a 12613 1727096147.69180: done sending task result for task 0afff68d-5257-a9dd-d073-00000000017a 12613 1727096147.69183: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12613 1727096147.69272: no more pending results, returning what we have 12613 1727096147.69276: results queue empty 12613 1727096147.69277: checking for any_errors_fatal 12613 1727096147.69282: done checking for any_errors_fatal 12613 1727096147.69283: checking for max_fail_percentage 12613 1727096147.69285: done checking for max_fail_percentage 12613 1727096147.69285: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.69286: done checking to see if all hosts have failed 12613 1727096147.69287: getting the remaining hosts for this loop 12613 1727096147.69288: done getting the remaining hosts for this loop 12613 1727096147.69298: getting the next task for host managed_node1 12613 1727096147.69304: done getting next task for host managed_node1 12613 1727096147.69308: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096147.69312: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096147.69329: getting variables 12613 1727096147.69331: in VariableManager get_vars() 12613 1727096147.69378: Calling all_inventory to load vars for managed_node1 12613 1727096147.69380: Calling groups_inventory to load vars for managed_node1 12613 1727096147.69382: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.69391: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.69393: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.69395: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.69533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.69661: done with get_vars() 12613 1727096147.69673: done getting variables 12613 1727096147.69715: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:55:47 -0400 (0:00:00.035) 0:00:11.334 ****** 12613 1727096147.69741: entering _queue_task() for managed_node1/copy 12613 1727096147.69957: worker is 1 (out of 1 available) 12613 1727096147.69972: exiting _queue_task() for managed_node1/copy 12613 1727096147.69984: done queuing things up, now waiting for results queue to drain 12613 1727096147.69985: waiting for pending results... 
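The "Enable network service" task above resolves to the service action plugin and is skipped because the distribution guard evaluates to False on this host; the censored result confirms the task sets no_log: true. A minimal sketch of a task written against that same guard, with the service name and other details assumed rather than taken from the role's source:

- name: Enable network service
  ansible.builtin.service:
    name: network          # assumed service name for an initscripts-based provider
    enabled: true
  no_log: true             # matches the "censored" result shown in the log above
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9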
12613 1727096147.70163: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12613 1727096147.70249: in run() - task 0afff68d-5257-a9dd-d073-00000000017b 12613 1727096147.70262: variable 'ansible_search_path' from source: unknown 12613 1727096147.70265: variable 'ansible_search_path' from source: unknown 12613 1727096147.70295: calling self._execute() 12613 1727096147.70368: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.70372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.70382: variable 'omit' from source: magic vars 12613 1727096147.70701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.72651: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.72722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.72749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.72780: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.72803: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.72862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.72885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.72905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.72934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.72945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.73058: variable 'ansible_distribution' from source: facts 12613 1727096147.73062: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.73084: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.73088: when evaluation is False, skipping this task 12613 1727096147.73090: _execute() done 12613 1727096147.73093: dumping result to json 12613 1727096147.73095: done dumping result, returning 12613 1727096147.73103: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-a9dd-d073-00000000017b] 12613 1727096147.73105: sending task result for task 0afff68d-5257-a9dd-d073-00000000017b 12613 1727096147.73202: done sending task result for task 0afff68d-5257-a9dd-d073-00000000017b 12613 1727096147.73204: 
WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.73281: no more pending results, returning what we have 12613 1727096147.73285: results queue empty 12613 1727096147.73286: checking for any_errors_fatal 12613 1727096147.73291: done checking for any_errors_fatal 12613 1727096147.73292: checking for max_fail_percentage 12613 1727096147.73296: done checking for max_fail_percentage 12613 1727096147.73296: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.73297: done checking to see if all hosts have failed 12613 1727096147.73298: getting the remaining hosts for this loop 12613 1727096147.73300: done getting the remaining hosts for this loop 12613 1727096147.73303: getting the next task for host managed_node1 12613 1727096147.73309: done getting next task for host managed_node1 12613 1727096147.73312: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096147.73316: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096147.73333: getting variables 12613 1727096147.73335: in VariableManager get_vars() 12613 1727096147.73383: Calling all_inventory to load vars for managed_node1 12613 1727096147.73386: Calling groups_inventory to load vars for managed_node1 12613 1727096147.73388: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.73396: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.73398: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.73400: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.73566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.73692: done with get_vars() 12613 1727096147.73700: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:55:47 -0400 (0:00:00.040) 0:00:11.374 ****** 12613 1727096147.73759: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096147.74054: worker is 1 (out of 1 available) 12613 1727096147.74067: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12613 1727096147.74173: done queuing things up, now waiting for results queue to drain 12613 1727096147.74174: waiting for pending results... 
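The "Ensure initscripts network file dependency is present" task resolves to the copy action plugin and is skipped on the same distribution guard. A sketch of how such a dependency file might be ensured; the destination path and content are assumptions, not the role's actual source:

- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network    # assumed path for an initscripts dependency file
    content: "# Created by the network role\n"
    mode: "0644"
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9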
12613 1727096147.74631: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12613 1727096147.74684: in run() - task 0afff68d-5257-a9dd-d073-00000000017c 12613 1727096147.74705: variable 'ansible_search_path' from source: unknown 12613 1727096147.74713: variable 'ansible_search_path' from source: unknown 12613 1727096147.74756: calling self._execute() 12613 1727096147.74852: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.74863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.74882: variable 'omit' from source: magic vars 12613 1727096147.75735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.78436: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.78517: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.78673: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.78677: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.78680: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.78710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.78739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.78765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.78816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.78836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.78966: variable 'ansible_distribution' from source: facts 12613 1727096147.78981: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.79004: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.79017: when evaluation is False, skipping this task 12613 1727096147.79072: _execute() done 12613 1727096147.79075: dumping result to json 12613 1727096147.79078: done dumping result, returning 12613 1727096147.79081: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-a9dd-d073-00000000017c] 12613 1727096147.79083: sending task result for task 0afff68d-5257-a9dd-d073-00000000017c 12613 1727096147.79376: done sending task result for task 0afff68d-5257-a9dd-d073-00000000017c 12613 1727096147.79379: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.79432: no more pending results, returning what we have 12613 1727096147.79435: results queue empty 12613 1727096147.79436: checking for any_errors_fatal 12613 1727096147.79442: done checking for any_errors_fatal 12613 1727096147.79443: checking for max_fail_percentage 12613 1727096147.79445: done checking for max_fail_percentage 12613 1727096147.79446: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.79447: done checking to see if all hosts have failed 12613 1727096147.79448: getting the remaining hosts for this loop 12613 1727096147.79449: done getting the remaining hosts for this loop 12613 1727096147.79453: getting the next task for host managed_node1 12613 1727096147.79461: done getting next task for host managed_node1 12613 1727096147.79465: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096147.79470: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096147.79488: getting variables 12613 1727096147.79490: in VariableManager get_vars() 12613 1727096147.79541: Calling all_inventory to load vars for managed_node1 12613 1727096147.79544: Calling groups_inventory to load vars for managed_node1 12613 1727096147.79546: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.79556: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.79559: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.79562: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.79739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.80089: done with get_vars() 12613 1727096147.80110: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:55:47 -0400 (0:00:00.064) 0:00:11.439 ****** 12613 1727096147.80198: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096147.80484: worker is 1 (out of 1 available) 12613 1727096147.80497: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12613 1727096147.80510: done queuing things up, now waiting for results queue to drain 12613 1727096147.80511: waiting for pending results... 
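The next two tasks dispatch to the collection's own network_connections and network_state action plugins, and both are skipped by the same guard. A sketch of how tasks invoking these modules are commonly shaped; the parameter and variable names here are assumptions and are not visible in this log:

- name: Configure networking connection profiles
  fedora.linux_system_roles.network_connections:
    provider: "{{ network_provider }}"        # assumed parameter and variable
    connections: "{{ network_connections }}"  # assumed parameter and variable

- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"              # assumed parameter and variable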
12613 1727096147.80810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 12613 1727096147.80957: in run() - task 0afff68d-5257-a9dd-d073-00000000017d 12613 1727096147.80980: variable 'ansible_search_path' from source: unknown 12613 1727096147.80988: variable 'ansible_search_path' from source: unknown 12613 1727096147.81032: calling self._execute() 12613 1727096147.81381: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.81385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.81388: variable 'omit' from source: magic vars 12613 1727096147.82200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.85024: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.85152: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.85198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.85474: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.85478: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.85550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.85614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.85719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.85764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.85814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.86001: variable 'ansible_distribution' from source: facts 12613 1727096147.86105: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.86175: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.86202: when evaluation is False, skipping this task 12613 1727096147.86214: _execute() done 12613 1727096147.86221: dumping result to json 12613 1727096147.86234: done dumping result, returning 12613 1727096147.86247: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-a9dd-d073-00000000017d] 12613 1727096147.86258: sending task result for task 0afff68d-5257-a9dd-d073-00000000017d 12613 1727096147.86512: done sending task result for task 0afff68d-5257-a9dd-d073-00000000017d 12613 1727096147.86516: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096147.86569: no more pending results, returning what we have 12613 1727096147.86574: results queue empty 12613 1727096147.86575: checking for any_errors_fatal 12613 1727096147.86582: done checking for any_errors_fatal 12613 1727096147.86583: checking for max_fail_percentage 12613 1727096147.86585: done checking for max_fail_percentage 12613 1727096147.86586: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.86587: done checking to see if all hosts have failed 12613 1727096147.86587: getting the remaining hosts for this loop 12613 1727096147.86589: done getting the remaining hosts for this loop 12613 1727096147.86593: getting the next task for host managed_node1 12613 1727096147.86601: done getting next task for host managed_node1 12613 1727096147.86605: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096147.86609: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.86631: getting variables 12613 1727096147.86633: in VariableManager get_vars() 12613 1727096147.86695: Calling all_inventory to load vars for managed_node1 12613 1727096147.86699: Calling groups_inventory to load vars for managed_node1 12613 1727096147.86701: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.86714: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.86717: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.86720: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.87243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.87526: done with get_vars() 12613 1727096147.87537: done getting variables 12613 1727096147.87592: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:55:47 -0400 (0:00:00.074) 0:00:11.513 ****** 12613 1727096147.87624: entering _queue_task() for managed_node1/debug 12613 1727096147.87903: worker is 1 (out of 1 available) 12613 1727096147.87916: exiting _queue_task() for managed_node1/debug 12613 1727096147.87927: done queuing things up, now waiting for results queue to drain 12613 1727096147.87928: waiting for pending results... 12613 1727096147.88379: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12613 1727096147.88453: in run() - task 0afff68d-5257-a9dd-d073-00000000017e 12613 1727096147.88479: variable 'ansible_search_path' from source: unknown 12613 1727096147.88489: variable 'ansible_search_path' from source: unknown 12613 1727096147.88529: calling self._execute() 12613 1727096147.88624: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.88636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.88650: variable 'omit' from source: magic vars 12613 1727096147.89217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.91674: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.91678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.91705: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.91743: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.91777: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.91862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.91903: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.91932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.91976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.92123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.92675: variable 'ansible_distribution' from source: facts 12613 1727096147.92678: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.92681: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.92683: when evaluation is False, skipping this task 12613 1727096147.92685: _execute() done 12613 1727096147.92687: dumping result to json 12613 1727096147.92689: done dumping result, returning 12613 1727096147.92691: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-a9dd-d073-00000000017e] 12613 1727096147.92694: sending task result for task 0afff68d-5257-a9dd-d073-00000000017e 12613 1727096147.92762: done sending task result for task 0afff68d-5257-a9dd-d073-00000000017e 12613 1727096147.92766: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096147.92822: no more pending results, returning what we have 12613 1727096147.92826: results queue empty 12613 1727096147.92827: checking for any_errors_fatal 12613 1727096147.92834: done checking for any_errors_fatal 12613 1727096147.92835: checking for max_fail_percentage 12613 1727096147.92838: done checking for max_fail_percentage 12613 1727096147.92839: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.92839: done checking to see if all hosts have failed 12613 1727096147.92840: getting the remaining hosts for this loop 12613 1727096147.92841: done getting the remaining hosts for this loop 12613 1727096147.92845: getting the next task for host managed_node1 12613 1727096147.92852: done getting next task for host managed_node1 12613 1727096147.92856: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096147.92860: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096147.92881: getting variables 12613 1727096147.92883: in VariableManager get_vars() 12613 1727096147.92938: Calling all_inventory to load vars for managed_node1 12613 1727096147.92941: Calling groups_inventory to load vars for managed_node1 12613 1727096147.92944: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.92954: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.92958: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.92961: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.93546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096147.93822: done with get_vars() 12613 1727096147.93835: done getting variables 12613 1727096147.93939: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:55:47 -0400 (0:00:00.063) 0:00:11.577 ****** 12613 1727096147.93999: entering _queue_task() for managed_node1/debug 12613 1727096147.94433: worker is 1 (out of 1 available) 12613 1727096147.94446: exiting _queue_task() for managed_node1/debug 12613 1727096147.94460: done queuing things up, now waiting for results queue to drain 12613 1727096147.94462: waiting for pending results... 
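Both the "Show stderr messages" and "Show debug messages" tasks resolve to the debug action plugin and are guarded by the same conditional. A sketch of the usual pattern of dumping a registered result through debug; the registered variable name is an assumption:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines   # assumed register name
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9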
12613 1727096147.94740: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12613 1727096147.94881: in run() - task 0afff68d-5257-a9dd-d073-00000000017f 12613 1727096147.94905: variable 'ansible_search_path' from source: unknown 12613 1727096147.94912: variable 'ansible_search_path' from source: unknown 12613 1727096147.94959: calling self._execute() 12613 1727096147.95054: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096147.95064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096147.95081: variable 'omit' from source: magic vars 12613 1727096147.95624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096147.98218: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096147.98294: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096147.98344: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096147.98387: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096147.98417: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096147.98505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096147.98539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096147.98577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096147.98622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096147.98641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096147.98783: variable 'ansible_distribution' from source: facts 12613 1727096147.98974: variable 'ansible_distribution_major_version' from source: facts 12613 1727096147.98977: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096147.98980: when evaluation is False, skipping this task 12613 1727096147.98982: _execute() done 12613 1727096147.98984: dumping result to json 12613 1727096147.98986: done dumping result, returning 12613 1727096147.98989: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-a9dd-d073-00000000017f] 12613 1727096147.98991: sending task result for task 0afff68d-5257-a9dd-d073-00000000017f 12613 1727096147.99064: done sending task result for task 0afff68d-5257-a9dd-d073-00000000017f 12613 1727096147.99069: WORKER 
PROCESS EXITING skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096147.99113: no more pending results, returning what we have 12613 1727096147.99117: results queue empty 12613 1727096147.99118: checking for any_errors_fatal 12613 1727096147.99124: done checking for any_errors_fatal 12613 1727096147.99125: checking for max_fail_percentage 12613 1727096147.99126: done checking for max_fail_percentage 12613 1727096147.99127: checking to see if all hosts have failed and the running result is not ok 12613 1727096147.99128: done checking to see if all hosts have failed 12613 1727096147.99128: getting the remaining hosts for this loop 12613 1727096147.99129: done getting the remaining hosts for this loop 12613 1727096147.99133: getting the next task for host managed_node1 12613 1727096147.99140: done getting next task for host managed_node1 12613 1727096147.99144: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096147.99148: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096147.99171: getting variables 12613 1727096147.99173: in VariableManager get_vars() 12613 1727096147.99224: Calling all_inventory to load vars for managed_node1 12613 1727096147.99227: Calling groups_inventory to load vars for managed_node1 12613 1727096147.99229: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096147.99239: Calling all_plugins_play to load vars for managed_node1 12613 1727096147.99241: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096147.99243: Calling groups_plugins_play to load vars for managed_node1 12613 1727096147.99780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.00039: done with get_vars() 12613 1727096148.00052: done getting variables 12613 1727096148.00116: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:55:48 -0400 (0:00:00.061) 0:00:11.638 ****** 12613 1727096148.00151: entering _queue_task() for managed_node1/debug 12613 1727096148.00605: worker is 1 (out of 1 available) 12613 1727096148.00617: exiting _queue_task() for managed_node1/debug 12613 1727096148.00629: done queuing things up, now waiting for results queue to drain 12613 1727096148.00631: waiting for pending results... 12613 1727096148.00856: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12613 1727096148.01135: in run() - task 0afff68d-5257-a9dd-d073-000000000180 12613 1727096148.01139: variable 'ansible_search_path' from source: unknown 12613 1727096148.01142: variable 'ansible_search_path' from source: unknown 12613 1727096148.01177: calling self._execute() 12613 1727096148.01300: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096148.01304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096148.01311: variable 'omit' from source: magic vars 12613 1727096148.01876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096148.03507: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096148.03563: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096148.03594: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096148.03620: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096148.03642: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096148.03706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096148.03728: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096148.03747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096148.03779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096148.03790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096148.03927: variable 'ansible_distribution' from source: facts 12613 1727096148.03930: variable 'ansible_distribution_major_version' from source: facts 12613 1727096148.03933: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096148.03935: when evaluation is False, skipping this task 12613 1727096148.03937: _execute() done 12613 1727096148.03939: dumping result to json 12613 1727096148.03941: done dumping result, returning 12613 1727096148.03944: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-a9dd-d073-000000000180] 12613 1727096148.03948: sending task result for task 0afff68d-5257-a9dd-d073-000000000180 skipping: [managed_node1] => { "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)" } 12613 1727096148.04122: no more pending results, returning what we have 12613 1727096148.04125: results queue empty 12613 1727096148.04126: checking for any_errors_fatal 12613 1727096148.04134: done checking for any_errors_fatal 12613 1727096148.04135: checking for max_fail_percentage 12613 1727096148.04136: done checking for max_fail_percentage 12613 1727096148.04137: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.04138: done checking to see if all hosts have failed 12613 1727096148.04138: getting the remaining hosts for this loop 12613 1727096148.04139: done getting the remaining hosts for this loop 12613 1727096148.04143: getting the next task for host managed_node1 12613 1727096148.04151: done getting next task for host managed_node1 12613 1727096148.04155: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096148.04158: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 12613 1727096148.04172: done sending task result for task 0afff68d-5257-a9dd-d073-000000000180 12613 1727096148.04175: WORKER PROCESS EXITING 12613 1727096148.04316: getting variables 12613 1727096148.04318: in VariableManager get_vars() 12613 1727096148.04362: Calling all_inventory to load vars for managed_node1 12613 1727096148.04365: Calling groups_inventory to load vars for managed_node1 12613 1727096148.04369: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.04394: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.04397: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.04401: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.04614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.04836: done with get_vars() 12613 1727096148.04856: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:55:48 -0400 (0:00:00.048) 0:00:11.686 ****** 12613 1727096148.04989: entering _queue_task() for managed_node1/ping 12613 1727096148.05262: worker is 1 (out of 1 available) 12613 1727096148.05284: exiting _queue_task() for managed_node1/ping 12613 1727096148.05295: done queuing things up, now waiting for results queue to drain 12613 1727096148.05297: waiting for pending results... 12613 1727096148.05484: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12613 1727096148.05573: in run() - task 0afff68d-5257-a9dd-d073-000000000181 12613 1727096148.05586: variable 'ansible_search_path' from source: unknown 12613 1727096148.05589: variable 'ansible_search_path' from source: unknown 12613 1727096148.05621: calling self._execute() 12613 1727096148.05689: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096148.05692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096148.05702: variable 'omit' from source: magic vars 12613 1727096148.06014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096148.07778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096148.07827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096148.08073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096148.08077: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096148.08079: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096148.08082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096148.08085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096148.08087: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096148.08130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096148.08148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096148.08288: variable 'ansible_distribution' from source: facts 12613 1727096148.08300: variable 'ansible_distribution_major_version' from source: facts 12613 1727096148.08322: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096148.08330: when evaluation is False, skipping this task 12613 1727096148.08337: _execute() done 12613 1727096148.08344: dumping result to json 12613 1727096148.08355: done dumping result, returning 12613 1727096148.08371: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-a9dd-d073-000000000181] 12613 1727096148.08381: sending task result for task 0afff68d-5257-a9dd-d073-000000000181 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096148.08527: no more pending results, returning what we have 12613 1727096148.08531: results queue empty 12613 1727096148.08533: checking for any_errors_fatal 12613 1727096148.08539: done checking for any_errors_fatal 12613 1727096148.08540: checking for max_fail_percentage 12613 1727096148.08542: done checking for max_fail_percentage 12613 1727096148.08543: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.08544: done checking to see if all hosts have failed 12613 1727096148.08545: getting the remaining hosts for this loop 12613 1727096148.08546: done getting the remaining hosts for this loop 12613 1727096148.08550: getting the next task for host managed_node1 12613 1727096148.08565: done getting next task for host managed_node1 12613 1727096148.08571: ^ task is: TASK: meta (role_complete) 12613 1727096148.08574: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096148.08606: getting variables 12613 1727096148.08608: in VariableManager get_vars() 12613 1727096148.08671: Calling all_inventory to load vars for managed_node1 12613 1727096148.08674: Calling groups_inventory to load vars for managed_node1 12613 1727096148.08676: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.08686: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.08688: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.08691: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.08702: done sending task result for task 0afff68d-5257-a9dd-d073-000000000181 12613 1727096148.08705: WORKER PROCESS EXITING 12613 1727096148.08925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.09076: done with get_vars() 12613 1727096148.09085: done getting variables 12613 1727096148.09141: done queuing things up, now waiting for results queue to drain 12613 1727096148.09143: results queue empty 12613 1727096148.09144: checking for any_errors_fatal 12613 1727096148.09146: done checking for any_errors_fatal 12613 1727096148.09146: checking for max_fail_percentage 12613 1727096148.09147: done checking for max_fail_percentage 12613 1727096148.09148: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.09148: done checking to see if all hosts have failed 12613 1727096148.09148: getting the remaining hosts for this loop 12613 1727096148.09149: done getting the remaining hosts for this loop 12613 1727096148.09151: getting the next task for host managed_node1 12613 1727096148.09156: done getting next task for host managed_node1 12613 1727096148.09157: ^ task is: TASK: Delete the device '{{ controller_device }}' 12613 1727096148.09158: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096148.09160: getting variables 12613 1727096148.09161: in VariableManager get_vars() 12613 1727096148.09180: Calling all_inventory to load vars for managed_node1 12613 1727096148.09181: Calling groups_inventory to load vars for managed_node1 12613 1727096148.09183: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.09186: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.09187: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.09189: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.09274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.09384: done with get_vars() 12613 1727096148.09391: done getting variables 12613 1727096148.09417: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 12613 1727096148.09510: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Monday 23 September 2024 08:55:48 -0400 (0:00:00.045) 0:00:11.732 ****** 12613 1727096148.09531: entering _queue_task() for managed_node1/command 12613 1727096148.09759: worker is 1 (out of 1 available) 12613 1727096148.09775: exiting _queue_task() for managed_node1/command 12613 1727096148.09786: done queuing things up, now waiting for results queue to drain 12613 1727096148.09787: waiting for pending results... 
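After the role_complete meta task, the play resumes its own cleanup. The "Delete the device 'nm-bond'" task name is templated from the controller_device play variable and dispatches to the command action plugin; the actual command line is not visible in this excerpt, so the following is a representative sketch only, guarded by the same distribution conditional as the role tasks above:

- name: Delete the device '{{ controller_device }}'
  ansible.builtin.command:
    cmd: ip link delete {{ controller_device }}   # representative command, not confirmed by the log
  when:
    - ansible_distribution in ['CentOS', 'RedHat']
    - ansible_distribution_major_version | int < 9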
12613 1727096148.09959: running TaskExecutor() for managed_node1/TASK: Delete the device 'nm-bond' 12613 1727096148.10033: in run() - task 0afff68d-5257-a9dd-d073-0000000001b1 12613 1727096148.10044: variable 'ansible_search_path' from source: unknown 12613 1727096148.10075: calling self._execute() 12613 1727096148.10146: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096148.10150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096148.10161: variable 'omit' from source: magic vars 12613 1727096148.10605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096148.12720: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096148.12772: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096148.12799: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096148.12824: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096148.12843: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096148.12910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096148.12931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096148.12948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096148.12977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096148.12993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096148.13084: variable 'ansible_distribution' from source: facts 12613 1727096148.13087: variable 'ansible_distribution_major_version' from source: facts 12613 1727096148.13106: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096148.13109: when evaluation is False, skipping this task 12613 1727096148.13112: _execute() done 12613 1727096148.13114: dumping result to json 12613 1727096148.13117: done dumping result, returning 12613 1727096148.13124: done running TaskExecutor() for managed_node1/TASK: Delete the device 'nm-bond' [0afff68d-5257-a9dd-d073-0000000001b1] 12613 1727096148.13127: sending task result for task 0afff68d-5257-a9dd-d073-0000000001b1 12613 1727096148.13215: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001b1 12613 1727096148.13218: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", 
"skip_reason": "Conditional result was False" } 12613 1727096148.13276: no more pending results, returning what we have 12613 1727096148.13279: results queue empty 12613 1727096148.13280: checking for any_errors_fatal 12613 1727096148.13281: done checking for any_errors_fatal 12613 1727096148.13282: checking for max_fail_percentage 12613 1727096148.13283: done checking for max_fail_percentage 12613 1727096148.13284: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.13285: done checking to see if all hosts have failed 12613 1727096148.13286: getting the remaining hosts for this loop 12613 1727096148.13287: done getting the remaining hosts for this loop 12613 1727096148.13291: getting the next task for host managed_node1 12613 1727096148.13298: done getting next task for host managed_node1 12613 1727096148.13301: ^ task is: TASK: Remove test interfaces 12613 1727096148.13304: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096148.13308: getting variables 12613 1727096148.13309: in VariableManager get_vars() 12613 1727096148.13365: Calling all_inventory to load vars for managed_node1 12613 1727096148.13369: Calling groups_inventory to load vars for managed_node1 12613 1727096148.13372: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.13382: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.13384: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.13386: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.13581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.13702: done with get_vars() 12613 1727096148.13710: done getting variables 12613 1727096148.13751: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:55:48 -0400 (0:00:00.042) 0:00:11.775 ****** 12613 1727096148.13782: entering _queue_task() for managed_node1/shell 12613 1727096148.14008: worker is 1 (out of 1 available) 12613 1727096148.14023: exiting _queue_task() for managed_node1/shell 12613 1727096148.14035: done queuing things up, now waiting for results queue to drain 12613 1727096148.14037: waiting for pending results... 
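'Remove test interfaces' (tasks/remove_test_interfaces_with_dhcp.yml:3) and every other task in this teardown re-evaluate the identical conditional and are skipped one by one. One common way to get that pattern, sketched below only as an assumed structure and not as the test suite's actual source, is a when: placed on an enclosing block, which each task inside the block inherits and evaluates for itself.

# Minimal sketch (assumed structure, not the original file): a when: on a block
# is inherited by every task in the block, so each task logs its own
# "Evaluated conditional ...: False" line and is skipped individually.
- name: Clean up initscripts-era test interfaces
  when: >-
    ansible_distribution in ['CentOS', 'RedHat']
    and ansible_distribution_major_version | int < 9
  block:
    - name: Remove test interfaces
      ansible.builtin.shell: echo "placeholder for the real cleanup commands"

    - name: Stop dnsmasq/radvd services
      ansible.builtin.shell: echo "placeholder for service shutdown"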
12613 1727096148.14384: running TaskExecutor() for managed_node1/TASK: Remove test interfaces 12613 1727096148.14390: in run() - task 0afff68d-5257-a9dd-d073-0000000001b5 12613 1727096148.14394: variable 'ansible_search_path' from source: unknown 12613 1727096148.14397: variable 'ansible_search_path' from source: unknown 12613 1727096148.14408: calling self._execute() 12613 1727096148.14506: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096148.14517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096148.14532: variable 'omit' from source: magic vars 12613 1727096148.14954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096148.16702: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096148.16759: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096148.16791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096148.16816: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096148.16835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096148.16903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096148.16923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096148.16941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096148.16970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096148.16984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096148.17080: variable 'ansible_distribution' from source: facts 12613 1727096148.17084: variable 'ansible_distribution_major_version' from source: facts 12613 1727096148.17103: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096148.17107: when evaluation is False, skipping this task 12613 1727096148.17110: _execute() done 12613 1727096148.17112: dumping result to json 12613 1727096148.17114: done dumping result, returning 12613 1727096148.17121: done running TaskExecutor() for managed_node1/TASK: Remove test interfaces [0afff68d-5257-a9dd-d073-0000000001b5] 12613 1727096148.17126: sending task result for task 0afff68d-5257-a9dd-d073-0000000001b5 12613 1727096148.17215: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001b5 12613 1727096148.17218: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in 
['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096148.17271: no more pending results, returning what we have 12613 1727096148.17275: results queue empty 12613 1727096148.17276: checking for any_errors_fatal 12613 1727096148.17280: done checking for any_errors_fatal 12613 1727096148.17281: checking for max_fail_percentage 12613 1727096148.17283: done checking for max_fail_percentage 12613 1727096148.17284: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.17284: done checking to see if all hosts have failed 12613 1727096148.17285: getting the remaining hosts for this loop 12613 1727096148.17286: done getting the remaining hosts for this loop 12613 1727096148.17290: getting the next task for host managed_node1 12613 1727096148.17298: done getting next task for host managed_node1 12613 1727096148.17300: ^ task is: TASK: Stop dnsmasq/radvd services 12613 1727096148.17303: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12613 1727096148.17307: getting variables 12613 1727096148.17308: in VariableManager get_vars() 12613 1727096148.17371: Calling all_inventory to load vars for managed_node1 12613 1727096148.17375: Calling groups_inventory to load vars for managed_node1 12613 1727096148.17377: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.17387: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.17390: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.17392: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.17538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.17669: done with get_vars() 12613 1727096148.17680: done getting variables 12613 1727096148.17725: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Monday 23 September 2024 08:55:48 -0400 (0:00:00.039) 0:00:11.814 ****** 12613 1727096148.17771: entering _queue_task() for managed_node1/shell 12613 1727096148.18133: worker is 1 (out of 1 available) 12613 1727096148.18145: exiting _queue_task() for managed_node1/shell 12613 1727096148.18159: done queuing things up, now waiting for results queue to drain 12613 1727096148.18161: waiting for pending results... 
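The skip decision depends on only two facts, ansible_distribution and ansible_distribution_major_version, combined in the expression reported as false_condition above. A small stand-alone playbook such as the hypothetical one below prints both facts and the result of the same expression, which makes it easy to confirm why a host like managed_node1 skips the entire teardown.

# Hypothetical debugging playbook, not part of the test suite; "managed_node1"
# is the inventory host used in this run, substitute your own host or group.
- name: Show why the teardown conditional evaluates to False
  hosts: managed_node1
  gather_facts: true
  tasks:
    - name: Print the facts and the conditional result
      ansible.builtin.debug:
        msg: >-
          distribution={{ ansible_distribution }}
          major_version={{ ansible_distribution_major_version }}
          conditional={{ ansible_distribution in ['CentOS', 'RedHat']
          and ansible_distribution_major_version | int < 9 }}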
12613 1727096148.18560: running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services 12613 1727096148.18615: in run() - task 0afff68d-5257-a9dd-d073-0000000001b6 12613 1727096148.18663: variable 'ansible_search_path' from source: unknown 12613 1727096148.18687: variable 'ansible_search_path' from source: unknown 12613 1727096148.18738: calling self._execute() 12613 1727096148.18874: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096148.18886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096148.18893: variable 'omit' from source: magic vars 12613 1727096148.19221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096148.20843: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096148.20999: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096148.21003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096148.21022: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096148.21055: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096148.21186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096148.21221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096148.21260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096148.21413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096148.21435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096148.21773: variable 'ansible_distribution' from source: facts 12613 1727096148.21776: variable 'ansible_distribution_major_version' from source: facts 12613 1727096148.21780: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096148.21783: when evaluation is False, skipping this task 12613 1727096148.21785: _execute() done 12613 1727096148.21788: dumping result to json 12613 1727096148.21791: done dumping result, returning 12613 1727096148.21794: done running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services [0afff68d-5257-a9dd-d073-0000000001b6] 12613 1727096148.21796: sending task result for task 0afff68d-5257-a9dd-d073-0000000001b6 12613 1727096148.21872: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001b6 skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096148.21928: no more pending results, returning what we have 12613 1727096148.21932: results queue empty 12613 1727096148.21933: checking for any_errors_fatal 12613 1727096148.21940: done checking for any_errors_fatal 12613 1727096148.21940: checking for max_fail_percentage 12613 1727096148.21942: done checking for max_fail_percentage 12613 1727096148.21943: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.21945: done checking to see if all hosts have failed 12613 1727096148.21945: getting the remaining hosts for this loop 12613 1727096148.21946: done getting the remaining hosts for this loop 12613 1727096148.21950: getting the next task for host managed_node1 12613 1727096148.21962: done getting next task for host managed_node1 12613 1727096148.21965: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 12613 1727096148.21969: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096148.21974: getting variables 12613 1727096148.21975: in VariableManager get_vars() 12613 1727096148.22027: Calling all_inventory to load vars for managed_node1 12613 1727096148.22030: Calling groups_inventory to load vars for managed_node1 12613 1727096148.22032: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.22179: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.22183: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.22186: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.22437: WORKER PROCESS EXITING 12613 1727096148.22461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.22659: done with get_vars() 12613 1727096148.22673: done getting variables 12613 1727096148.22730: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Monday 23 September 2024 08:55:48 -0400 (0:00:00.049) 0:00:11.864 ****** 12613 1727096148.22760: entering _queue_task() for managed_node1/command 12613 1727096148.23098: worker is 1 (out of 1 available) 12613 1727096148.23113: exiting _queue_task() for managed_node1/command 12613 1727096148.23124: done queuing things up, now waiting for results queue to drain 12613 1727096148.23126: waiting for pending results... 
12613 1727096148.23344: running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript 12613 1727096148.23420: in run() - task 0afff68d-5257-a9dd-d073-0000000001b7 12613 1727096148.23433: variable 'ansible_search_path' from source: unknown 12613 1727096148.23463: calling self._execute() 12613 1727096148.23542: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096148.23545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096148.23558: variable 'omit' from source: magic vars 12613 1727096148.23885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096148.25669: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096148.25836: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096148.25841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096148.25857: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096148.25902: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096148.26026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096148.26054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096148.26070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096148.26097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096148.26107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096148.26223: variable 'ansible_distribution' from source: facts 12613 1727096148.26226: variable 'ansible_distribution_major_version' from source: facts 12613 1727096148.26245: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096148.26248: when evaluation is False, skipping this task 12613 1727096148.26250: _execute() done 12613 1727096148.26255: dumping result to json 12613 1727096148.26257: done dumping result, returning 12613 1727096148.26275: done running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript [0afff68d-5257-a9dd-d073-0000000001b7] 12613 1727096148.26278: sending task result for task 0afff68d-5257-a9dd-d073-0000000001b7 12613 1727096148.26361: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001b7 12613 1727096148.26363: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096148.26418: no more pending results, returning what we have 12613 1727096148.26422: results queue empty 12613 1727096148.26423: checking for any_errors_fatal 12613 1727096148.26428: done checking for any_errors_fatal 12613 1727096148.26429: checking for max_fail_percentage 12613 1727096148.26431: done checking for max_fail_percentage 12613 1727096148.26431: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.26432: done checking to see if all hosts have failed 12613 1727096148.26433: getting the remaining hosts for this loop 12613 1727096148.26435: done getting the remaining hosts for this loop 12613 1727096148.26438: getting the next task for host managed_node1 12613 1727096148.26446: done getting next task for host managed_node1 12613 1727096148.26449: ^ task is: TASK: Verify network state restored to default 12613 1727096148.26454: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12613 1727096148.26458: getting variables 12613 1727096148.26459: in VariableManager get_vars() 12613 1727096148.26515: Calling all_inventory to load vars for managed_node1 12613 1727096148.26518: Calling groups_inventory to load vars for managed_node1 12613 1727096148.26520: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.26531: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.26533: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.26536: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.26693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.26819: done with get_vars() 12613 1727096148.26828: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Monday 23 September 2024 08:55:48 -0400 (0:00:00.041) 0:00:11.906 ****** 12613 1727096148.26903: entering _queue_task() for managed_node1/include_tasks 12613 1727096148.27141: worker is 1 (out of 1 available) 12613 1727096148.27157: exiting _queue_task() for managed_node1/include_tasks 12613 1727096148.27171: done queuing things up, now waiting for results queue to drain 12613 1727096148.27172: waiting for pending results... 
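'Verify network state restored to default' (tests_bond_removal.yml:253) is queued for the include_tasks action rather than a module, so a False when: skips the include itself and nothing from the included file is loaded or counted; it therefore appears as a single skipped task. The sketch below illustrates that shape only: the included file name is a placeholder, and whether the conditional sits directly on the include or is inherited from an enclosing block cannot be read from this trace.

# Sketch only: the real task lives at tests_bond_removal.yml:253; the file it
# includes is not named in this trace, so "check_network_state.yml" is a
# placeholder.
- name: Verify network state restored to default
  ansible.builtin.include_tasks: check_network_state.yml
  when: >-
    ansible_distribution in ['CentOS', 'RedHat']
    and ansible_distribution_major_version | int < 9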
12613 1727096148.27344: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 12613 1727096148.27418: in run() - task 0afff68d-5257-a9dd-d073-0000000001b8 12613 1727096148.27430: variable 'ansible_search_path' from source: unknown 12613 1727096148.27461: calling self._execute() 12613 1727096148.27537: variable 'ansible_host' from source: host vars for 'managed_node1' 12613 1727096148.27541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12613 1727096148.27551: variable 'omit' from source: magic vars 12613 1727096148.27872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12613 1727096148.29955: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12613 1727096148.29999: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12613 1727096148.30027: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12613 1727096148.30061: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12613 1727096148.30078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12613 1727096148.30140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12613 1727096148.30165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12613 1727096148.30185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12613 1727096148.30210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12613 1727096148.30220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12613 1727096148.30322: variable 'ansible_distribution' from source: facts 12613 1727096148.30326: variable 'ansible_distribution_major_version' from source: facts 12613 1727096148.30342: Evaluated conditional ((ansible_distribution in ['CentOS','RedHat'] and ansible_distribution_major_version | int < 9)): False 12613 1727096148.30345: when evaluation is False, skipping this task 12613 1727096148.30348: _execute() done 12613 1727096148.30350: dumping result to json 12613 1727096148.30355: done dumping result, returning 12613 1727096148.30361: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0afff68d-5257-a9dd-d073-0000000001b8] 12613 1727096148.30366: sending task result for task 0afff68d-5257-a9dd-d073-0000000001b8 12613 1727096148.30463: done sending task result for task 0afff68d-5257-a9dd-d073-0000000001b8 12613 1727096148.30466: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(ansible_distribution in ['CentOS','RedHat'] and 
ansible_distribution_major_version | int < 9)", "skip_reason": "Conditional result was False" } 12613 1727096148.30527: no more pending results, returning what we have 12613 1727096148.30530: results queue empty 12613 1727096148.30532: checking for any_errors_fatal 12613 1727096148.30536: done checking for any_errors_fatal 12613 1727096148.30537: checking for max_fail_percentage 12613 1727096148.30538: done checking for max_fail_percentage 12613 1727096148.30539: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.30540: done checking to see if all hosts have failed 12613 1727096148.30541: getting the remaining hosts for this loop 12613 1727096148.30542: done getting the remaining hosts for this loop 12613 1727096148.30546: getting the next task for host managed_node1 12613 1727096148.30557: done getting next task for host managed_node1 12613 1727096148.30559: ^ task is: TASK: meta (flush_handlers) 12613 1727096148.30561: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096148.30565: getting variables 12613 1727096148.30569: in VariableManager get_vars() 12613 1727096148.30621: Calling all_inventory to load vars for managed_node1 12613 1727096148.30623: Calling groups_inventory to load vars for managed_node1 12613 1727096148.30625: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.30635: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.30638: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.30640: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.30830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.30962: done with get_vars() 12613 1727096148.30973: done getting variables 12613 1727096148.31023: in VariableManager get_vars() 12613 1727096148.31038: Calling all_inventory to load vars for managed_node1 12613 1727096148.31039: Calling groups_inventory to load vars for managed_node1 12613 1727096148.31041: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.31044: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.31045: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.31047: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.31132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.31335: done with get_vars() 12613 1727096148.31349: done queuing things up, now waiting for results queue to drain 12613 1727096148.31351: results queue empty 12613 1727096148.31354: checking for any_errors_fatal 12613 1727096148.31356: done checking for any_errors_fatal 12613 1727096148.31357: checking for max_fail_percentage 12613 1727096148.31358: done checking for max_fail_percentage 12613 1727096148.31359: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.31360: done checking to see if all hosts have failed 12613 1727096148.31361: getting the remaining hosts for this loop 12613 1727096148.31362: done getting the remaining hosts for this loop 12613 1727096148.31365: getting the next task for host managed_node1 12613 
1727096148.31372: done getting next task for host managed_node1 12613 1727096148.31374: ^ task is: TASK: meta (flush_handlers) 12613 1727096148.31375: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12613 1727096148.31378: getting variables 12613 1727096148.31379: in VariableManager get_vars() 12613 1727096148.31403: Calling all_inventory to load vars for managed_node1 12613 1727096148.31406: Calling groups_inventory to load vars for managed_node1 12613 1727096148.31407: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.31418: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.31421: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.31424: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.31614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.31826: done with get_vars() 12613 1727096148.31835: done getting variables 12613 1727096148.31888: in VariableManager get_vars() 12613 1727096148.31914: Calling all_inventory to load vars for managed_node1 12613 1727096148.31916: Calling groups_inventory to load vars for managed_node1 12613 1727096148.31918: Calling all_plugins_inventory to load vars for managed_node1 12613 1727096148.31923: Calling all_plugins_play to load vars for managed_node1 12613 1727096148.31925: Calling groups_plugins_inventory to load vars for managed_node1 12613 1727096148.31927: Calling groups_plugins_play to load vars for managed_node1 12613 1727096148.32222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12613 1727096148.32439: done with get_vars() 12613 1727096148.32456: done queuing things up, now waiting for results queue to drain 12613 1727096148.32458: results queue empty 12613 1727096148.32459: checking for any_errors_fatal 12613 1727096148.32460: done checking for any_errors_fatal 12613 1727096148.32461: checking for max_fail_percentage 12613 1727096148.32462: done checking for max_fail_percentage 12613 1727096148.32463: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.32464: done checking to see if all hosts have failed 12613 1727096148.32464: getting the remaining hosts for this loop 12613 1727096148.32465: done getting the remaining hosts for this loop 12613 1727096148.32510: getting the next task for host managed_node1 12613 1727096148.32514: done getting next task for host managed_node1 12613 1727096148.32515: ^ task is: None 12613 1727096148.32517: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12613 1727096148.32518: done queuing things up, now waiting for results queue to drain 12613 1727096148.32519: results queue empty 12613 1727096148.32520: checking for any_errors_fatal 12613 1727096148.32521: done checking for any_errors_fatal 12613 1727096148.32521: checking for max_fail_percentage 12613 1727096148.32522: done checking for max_fail_percentage 12613 1727096148.32523: checking to see if all hosts have failed and the running result is not ok 12613 1727096148.32524: done checking to see if all hosts have failed 12613 1727096148.32526: getting the next task for host managed_node1 12613 1727096148.32529: done getting next task for host managed_node1 12613 1727096148.32530: ^ task is: None 12613 1727096148.32531: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
PLAY RECAP *********************************************************************
managed_node1 : ok=7 changed=0 unreachable=0 failed=0 skipped=151 rescued=0 ignored=0
Monday 23 September 2024 08:55:48 -0400 (0:00:00.057) 0:00:11.963 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.82s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_initscripts.yml:5
Gather the minimum subset of ansible_facts required by the network role test --- 0.83s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.75s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Show debug messages for the network_state --- 0.11s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.11s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Include the task 'assert_profile_present.yml' --------------------------- 0.11s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces --- 0.10s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Assert that the interface is present - 'nm-bond' ------------------------ 0.10s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.09s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Show debug messages for the network_connections --- 0.09s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.09s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces --- 0.09s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
fedora.linux_system_roles.network : Enable network service -------------- 0.09s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider --- 0.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
fedora.linux_system_roles.network : Ensure initscripts network file dependency is present --- 0.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
fedora.linux_system_roles.network : Show stderr messages for the network_connections --- 0.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
** TEST check IPv4 ------------------------------------------------------ 0.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
fedora.linux_system_roles.network : Configure networking state ---------- 0.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Include the task 'get_interface_stat.yml' ------------------------------- 0.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
12613 1727096148.32663: RUNNING CLEANUP